Invited Review

Defining Standards in Experimental Microsurgical Training: Recommendations of the European Society for Surgical Research (ESSR) and the International Society for Experimental Microsurgery (ISEM)

Tolba R.H.a · Czigány Z.b · Osorio Lujan S.c · Oltean M.d · Axelsson M.e · Akelina Y.f · Di Cataldo A.g · Miko I.h · Furka I.h · Dahmen U.i · Kobayashi E.j · Ionac M.k · Nemeth N.h

Author affiliations

aInstitute for Laboratory Animal Science and Experimental Surgery, RWTH Aachen University, Aachen, Germany
bDepartment of Surgery and Transplantation, RWTH Aachen University, Aachen, Germany
cDepartment of Cardiology, Heart Institute, Children’s Hospital Colorado, Denver, CO, USA
dThe Transplantation Institute, Sahlgrenska University Hospital, Gothenburg, Sweden
eDepartment of Biological and Environmental Sciences, University of Gothenburg, Gothenburg, Sweden
fMicrosurgery Research and Training Laboratory, Department of Orthopedic Surgery, Columbia University, New York, NY, USA
gDepartment of Oncological Surgery, University of Catania, Catania, Italy
hDepartment of Operative Techniques and Surgical Research, Faculty of Medicine, University of Debrecen, Debrecen, Hungary
iExperimental Transplantation Surgery, Department of General, Visceral, and Vascular Surgery, University Hospital Jena, Jena, Germany
jDepartment of Organ Fabrication, Keio University School of Medicine, Tokyo, Japan
kClinic of Vascular Surgery and Reconstructive Microsurgery, Victor Babes University of Medicine and Pharmacy, Timisoara, Romania

Corresponding Author

René H. Tolba, MD, PhD

Institute for Laboratory Animal Science and Experimental Surgery, RWTH Aachen University, Pauwelsstrasse 30

DE–52074 Aachen (Germany)

E-Mail rtolba@ukaachen.de

Norbert Nemeth, MD, PhD

Department of Operative Techniques and Surgical Research, Faculty of Medicine, University of Debrecen, Moricz Zs. krt. 22

HU–4032 Debrecen (Hungary)

E-Mail nemeth@med.unideb.hu

Eur Surg Res 2017;58:246–262

Abstract

Background: Expectations of surgeons in modern surgical practice are extremely high, with minimal complication rates and maximal patient safety as paramount objectives. Both of these aims are highly dependent on individual technical skills that require sustained, focused, and efficient training outside the clinical environment. At the same time, there is increasing moral and ethical pressure to reduce the use of animals in research and training, which has fundamentally changed the practice of microsurgical training and research. Various animal models were introduced and widely used during the mid-20th century, the pioneering era of experimental microsurgery. Since then, a large number of ex vivo training concepts and quality control measures have been proposed, all aiming to reduce the number of animals without compromising the quality and outcome of training. Summary: Numerous microsurgical training courses are available worldwide, but there is no general agreement concerning the standardization of microsurgical training. The major aim of this literature review and recommendation is to give an overview of various aspects of microsurgical training. We introduce here the findings of a previous survey-based analysis of microsurgical courses within our network. Basic principles behind microsurgical training (3Rs, good laboratory practice, 3Cs), considerations around various microsurgical training models, as well as several skill assessment tools are discussed. Recommendations are formulated following intense discussions within the European Society for Surgical Research (ESSR) and the International Society for Experimental Microsurgery (ISEM), based on the scientific literature as well as on several decades of experience in the field of experimental (micro)surgery and preclinical research, represented by the contributing authors.
Key Messages: Although ex vivo models are crucial for the replacement and reduction of live animal use, living animals are still indispensable at every level of training which aims at more than just a basic introduction to microsurgical techniques. Modern, competency-based microsurgical training is multi-level, implementing different objective assessment tools as outcome measures. A clear consensus on fundamental principles of microsurgical training and more active international collaboration for the sake of standardization are urgently needed.

© 2017 S. Karger AG, Basel


Introduction

The main purpose of using animal models is to reproduce disease processes, to test biomedical devices or compounds, and to help answer questions of disease pathogenesis, prevention, and treatment in human and veterinary medicine. However, animal models are also fundamental to the training of biomedical scientists and even more important for the preclinical training of future surgeons. While training in large animals closely resembles surgical technique in humans, experimental microsurgery differs substantially from macrosurgery: it requires specific (micro)surgical instruments, very fine suture materials, optical magnification, and precise hand-eye coordination in a very small, indirect field of view.

Microsurgical courses are organized in various parts of the world to help scientists and surgeons acquire the necessary microsurgical skills. Due to ethical and scientific issues and new legislation on animal research, leading centers have developed nonliving training models allowing simulation of the key procedures, as well as structured microsurgical training guidelines integrating modern principles of quality control. However, despite positive developments in some centers, there is still no international consensus on microsurgical training. Most teams still use their “well-established” in-house protocols, often without any objective validation. There are also no competency-based criteria available that could serve as thresholds for progression between different levels of training [1].

In June 2015 a joint round table discussion of the European Society for Surgical Research (ESSR) and the International Society for Experimental Microsurgery (ISEM), at the 50th Golden Anniversary Congress of the ESSR in Liverpool, aimed to issue a focused document on microsurgical research and training as a recommendation of these two societies. These endeavors are embodied in the present work, which offers an overview of state-of-the-art microsurgical training. Basic principles of experimental research and microsurgery are summarized. Current microsurgical training models and training concepts as well as issues concerning quality assessment and the translation of skills to clinical practice are discussed.

Basic Principles Driving Experimental Microsurgery and Training

Over the past decades, the following 3 basic principles have been developed: 3Rs (replacement, reduction, refinement), GLP (good laboratory practice), and 3Cs (curriculum, competence, clinical performance). The implementation of these has substantially contributed to modern research and training practice in experimental and clinical microsurgery.

3Rs

A driving force behind many developments in preclinical research during the late 20th century was the principle of the 3Rs, postulated by Russell and Burch [2] in 1959. Their claims provided the basis for humane procedures in experimental biology and have directed new laws and regulations in many countries around the world. In most western nations, scientists are required by law to work along its lines [3]. The 3Rs stand for the well-known replacement, reduction, and refinement. Some authors have discussed the extended concept of the 4Rs, which adds the rehabilitation of animals (mostly large animals) used in experimental research [4].

In the case of experimental microsurgery, full replacement of animals is not feasible; hence, the importance of measures to refine techniques and to reduce the number of animals needed is obvious. High-quality, standardized training will also contribute to improved data quality in preclinical research. Compliance with the social and biological requirements of animals may currently be challenging for some institutions; going forward, however, infrastructural, educational, and organizational planning should aim to meet these standards.

Good Laboratory Practice

In the 1970s, the US authorities observed several deficiencies in planning, performing, documenting, and reporting of preclinical studies [5]. The political answer to these scandalous cases was the implementation of the GLP guidelines. GLP principles were initially developed for experimental studies in the field of substance toxicity, pharmacodynamics, and kinetics. However, the widespread acceptance of these guidelines also contributed to general improvements in laboratory standards and, therefore, in microsurgical research and training as well. These developments resulted in a claim for international standards, monitoring, and quality assurance.

3Cs

The classical “learning by doing” approach is still widely used. However, it is considered inappropriate for microsurgical training because of the frustration it causes, the high number of animals used, and the valuable time spent. The basics of modern competency-based microsurgical education can be summarized in the principle of the 3Cs, first suggested by Kobayashi [6] in 2015 (originally published by Kobayashi and Lefor in Japanese). The 3Cs stand for curriculum, competence, and clinical performance.

Curriculum: In the microsurgical learning process a well-designed curriculum is essential. This usually consists of several different training models and approaches (see chapter 3).

Competence: Learning a microsurgical procedure has several components. Competency-based learning is learner centered [7]: it allows trainees to learn procedures at their individual pace, targeting predefined goals. In this phase the introduction of objective assessment tools is also essential to address the different needs of trainees.

Clinical performance: This term was defined for clinical microsurgeons. For this group of trainees, the clinical translation of the time spent and skills obtained in the training laboratory is highly relevant. However, it can also be adapted for biomedical researchers, where performance is manifested in data quality, a reduced number of animals needed, and improved animal welfare within the framework of research projects. Performance in research or in the clinical setting is one of the most important factors when evaluating the justification for our microsurgical training programs.

All microsurgical training programs should follow these basic principles of animal welfare, laboratory research practice, and education.

Considerations on Microsurgical Training Concepts

Typing the keywords “microsurgical training” and “microsurgical models” into the search engine of the MEDLINE database returns several hundred hits, reflecting the high number of training approaches established for different levels of microsurgical training. In this section of our article we give an overview of the prerequisites of organized institutional microsurgical training, as well as a summary of the in silico, ex vivo, and in vivo models used in different microsurgical education concepts.

Thoughts on Instrumental Background

Regarding the microsurgical instruments, we consider 3 issues to be of importance: quality, standardization, and maintenance.

We recommend the use of high-quality instruments from the very beginning of training; the minimum equipment required to teach different levels of microvascular surgery to a trainee is shown in Table 1. Although high-quality instruments are essential for microsurgery, in many laboratories beginner trainees often practice with damaged or low-quality “training” instruments. Inadequate equipment can force candidates to compensate for the deficits of inferior instruments with unnecessary, awkward movements. This is very detrimental: it causes unnecessary frustration through a high failure rate and promotes the acquisition of incorrect handling techniques. Once learned, incorrect techniques are difficult to “reprogram” in a later phase of training and will hinder the development of delicate and reliable microsurgical skills.

Table 1.

Instrumental prerequisites of microsurgical training

We recommend the use of a standardized set of instruments for training. However, there are also arguments for a diverse training setup. Many centers operate with similar “workstations” with standard microscopes and instrument setups for all candidates, to avoid any differences in training for instrumental reasons. However, rotating participants between workstations to try various types of microscopes, from simple tabletop settings to the complex standing microscopes used clinically, as well as testing less frequently used instruments (e.g., various forceps with or without platforms, bayonet instruments, needle holders with or without catch), could help develop more confidence in instrument handling [8]. A further description of the basic instrumental needs for microsurgical training is beyond the scope of this review and can be found in previous publications [9, 10].

We also recommend the inclusion of instrument care in the training curriculum. Educating the trainee on the proper care, handling, and cleaning of these delicate instruments must be part of the pretraining phase in order to reduce the risk of damage and the associated costs [11].

Supervision and Trainee/Tutor Ratio

Microsurgical training requires intense supervision. In general, individual supervision is highly recommended during microsurgical training with the lowest possible trainee/tutor ratio [12-14], while “mass training” should be avoided [13].

The number of trainees per supervisor depends on 2 factors: the quality of the training curriculum and the didactic expertise of the tutors. The highest acceptable trainee/tutor ratio depends heavily on the curriculum: with a well-designed training program featuring an established step-by-step protocol and strong supporting material for learning outside the laboratory (e.g., training videos, e-learning, illustrated course booklets), a higher ratio can be tolerated.

Supervisors should not only have considerable teaching experience but also undergo didactic training. Tutors for microsurgical training should have several years of experience in the field in order to have developed an appropriate sense for recognizing problematic steps, and to help candidates stay motivated and overcome difficulties. Accordingly, appropriate and continuous training of the tutors to obtain and maintain these skills is essential, but often underestimated.

Teaching Concepts

Surgical education was long based on the apprenticeship model, in which training rests on the subjective opinion of senior faculty members. The length of training is not strictly defined but depends on randomly occurring suitable learning situations and on the senior faculty's subjective assessment of skill level and skill acquisition [15]. Increasing hospital costs, patient safety concerns, and structural changes in education made a reappraisal of this training concept necessary [16, 17].

Consequently, alternative training methods were developed to attain the required dexterity skills outside the clinical situation. In the beginning, classical ex vivo models were used. Nowadays, computer technology, virtual reality, and the development of microsurgical simulators using real surgical instruments and visual – as well as haptic – feedback (in silico training) offer valuable alternatives besides classical ex vivo models [18-20]. However, the exact role of this training modality in routine surgical education is still uncertain.

The following basic sequence is gaining increasing acceptance: acquisition of theoretical knowledge, acquisition of basic skills using surrogate plastic and biomaterials, and, as the last step, training of complex procedures using living animals. Over the past years, several comprehensive studies have been published on novel strategies for reducing the use of animals and for training candidates on theoretical and practical levels without living subjects [21-23].

Before – or in parallel with – training in the laboratory, the candidate has to develop solid theoretical knowledge of microsurgical techniques [24]. Besides surgical aspects, theoretical training must integrate the basics of veterinary anesthesia, analgesia and pain recognition, animal welfare, and a basic understanding of animal behavior and handling.

This can be achieved in the conventional way, through lectures, textbooks, comprehensive reviews, and course booklets or other course material. To reduce the time and high costs of teaching in the form of seminars and attendance blocks, we advocate a blended learning concept incorporating e-learning. Establishing e-learning materials not only contributes to easier and better standardization at the national and international level but can also, in the long term, radically reduce costs for candidates (travel and accommodation) and training centers (lecturers, venue). Blended learning can also help avoid fatigue and a drop in cognitive capacity during long attendance blocks. Messaoudi et al. [25] published their experience with the first e-learning program for microsurgery in France in 2015, reporting high satisfaction, especially among students residing far from their training centers. Today, free-of-charge video platforms are widely available and even scientific video journals are gaining ground; therefore, the role of high-quality, didactic training videos in surgical training and in its international standardization is strongly emphasized [23, 26, 27].

Basic practical microsurgical training should start with nonliving models for ex vivo training. The selection of nonliving models spans from the very simple – but not very realistic – rubber glove or rubber membrane suturing to sophisticated computer simulators, and also includes exercises such as anastomoses on biomaterials [18, 21, 28-30]. A major aim of this training phase is to teach depth perception, hand-eye coordination, and gentle handling of the instruments and microscope. Furthermore, candidates have to develop an appropriate feel for movements under the microscope and to understand their personal ergonomics (the basic body and hand positions that avoid tremor and fatigue), without sacrificing living animals. At this stage it is recommended to break down all movements into very simple elements (e.g., analyzing and practicing every single step of taking a bite or tying a knot).

Establishing an institutional biobank or organ-sharing system may further reduce the use of living animals for training purposes. In most research facilities, numerous undamaged animal organs are discarded daily as by-products of specific research projects. These could be retrieved and used for educational purposes (e.g., cryopreserved rat vessel samples, pig hearts with intact coronary arteries).

In the modern multistep approach, trainees usually start in vivo training after obtaining the basic knowledge and skills [31]. The use of in vivo models has been explored in detail, resulting in a vast literature introducing different techniques to improve training protocols and reduce the use of living subjects [32, 33]. In vivo training has one key advantage: the experience of living tissue properties, where mishandling and technical mistakes lead to bleeding, thrombosis, or other complications. However, working with living animals can also be very stressful for novices. Large vessels may be damaged even before the microvascular anastomosis is performed and, even worse, animals can die unexpectedly from bleeding or during prolonged anesthesia.

Training on living animals requires a strict ethical assessment process and may require additional qualification (FELASA B or similar) even within the framework of a course [22]. Several systematic reviews are available on this topic, dealing with different microsurgical training elements and selectively testing the validity of these models [1, 16, 34, 35].

Training Programs and Issues of Standardization Worldwide

The lack of standards regarding the structure and content of the microsurgical courses available worldwide has led to enormous heterogeneity, as concluded by several groups after analyzing microsurgical courses at the national or international level [36-42]. Similarly, several groups have already declared that an ideal training tool or concept does not exist.

Theoretically, the ideal training should reproduce the preclinical or clinical situation in a realistic way. It must be generally recognized, reproducible, and free of ethical issues and any risk of hazards [34]. Furthermore, it has to be cost-efficient and require little maintenance. Although such an “ideal tool” does not exist, a theoretically ideal training might be approximated by means of well-designed concepts, individually customized for training groups, levels, and purposes. A certain diversity is necessary to reach the specific predefined goals, and the choice of well-established traditional training models should stay in the hands of the training centers. Despite leaving this freedom to the training centers, there is an increasing need for standardization of skill assessment in surgical training [36].

In 2013, the ISEM performed a survey-based analysis within its international network (Nemeth et al., unpubl. data). A query was designed and sent out electronically to the ISEM centers in Brazil, Canada, PR China, Germany, Hungary, Italy, Japan, South Korea, Romania, Sweden, Spain, Taiwan, and the USA. Participants were asked to provide a general description of the course (size, frequency, duration) as well as its academic and professional credentials. In addition, information on course funding was requested.

Twelve of the 13 centers completed the questionnaire, a response rate of approximately 92%. This study revealed substantial heterogeneity regarding the program, frequency, and time frame of the courses offered by the different institutions. Course duration ranged between 24 and 40 h, with a participant/tutor ratio varying between 1:1 and 5:1. The curriculum of most programs included classical elements of basic microsurgical technique, e.g., proper handling of microsurgical instruments, atraumatic tissue preparation, suture techniques on various models and biological surrogate materials (e.g., pig foot, chicken wing), end-to-end vascular anastomosis on isolated vessels and/or the anesthetized rat abdominal aorta or femoral artery, and epi-/perineural suturing. However, a few departments (mostly those offering a maximum of 2–4 courses per year) considered personalized, custom-tailored programs their most important merit.

Most programs considered it important to keep the participant/tutor ratio low, the key purpose being to ensure sufficient individual training time with intense supervision. Nevertheless, our survey revealed considerable diversity in the courses offered.

Funding for the courses was mostly ensured by participation fees. However, based on our findings, it can be concluded that from a purely economic point of view these courses could not survive. Most of the organizing groups mentioned the support of the host institution as a very important prerequisite. Concretely, institutional support consisted of providing venues and/or equipment (e.g., microscopes, instruments) or other types of support according to departmental/university strategies. The added value, the vocation and attitude of the tutors, the prestige of the department and university, and the special skills obtained together create a value that is hard to quantify but makes these courses engaging and keeps them alive. Although this survey was only a cross-sectional snapshot of training within the ISEM network at a given time, the abovementioned findings correlate well with the international literature [36-42].

Assessment Tools and Clinical Translation of Microsurgical Skills

Microsurgery training is an essential part of surgical training programs for numerous specialties (e.g., hand surgeons, orthopedic surgeons, plastic surgeons, ENT specialists, ophthalmologists, experimental researchers) [32, 36]. The most important question is whether or not previous training on prosthetic models or living animals can really improve clinical surgical skills. In other words, how can we make sure that the predictive validity [34] (the relationship between performance on models and clinical microsurgical skills) of the training concept used is satisfactory?

Most microsurgical programs do not evaluate the competence achieved but merely confirm participation. Leung et al. [36] reviewed 39 different microsurgical courses offered worldwide and reported that 75% (18/24) of the basic microsurgical courses did not include any evaluation method to test surgical performance and improvement. Only 1 center used a formal validation method (a global rating scale, GRS) and calculated a final grade for the trainees.

Recent reports on the gain in competence after participating in a microsurgical class have revealed conflicting results. Studinger et al. [43] evaluated whether residents and fellows who had completed various microsurgical courses were able to perform simple end-to-end anastomoses of the rat femoral artery and vein. Interestingly, a high failure rate was observed: only 64% of participants could perform a patent arterial or venous anastomosis. The authors concluded that the duration of the previously attended training courses had no significant effect on performance and was not predictive of successfully performing microsurgery [43]. This was not confirmed in a recent study by Christensen et al. [44], who analyzed data from a larger group of 61 participants in laboratory microsurgical training. In their experience, patency rates in training and days spent in the laboratory were fair predictors of the later clinical outcome of free flap and replantation surgeries. In a further work, Atkins et al. [45] investigated the effects of a 5-day microsurgical skills course in a group of 30 surgeons. Among these candidates, 60% (18/30) improved their performance and 10% (3/30) maintained their skill level, but 30% (9/30) actually scored lower in a conventional rat femoral artery end-to-end anastomosis exercise.

These findings raise questions regarding the long-term clinical translatability of skills (learned via sacrificing living animals) and urge us to improve the competency criteria of our training methods and develop a way to assess our success or failure as a training center.

Various assessment tools – most of which were developed for general surgery training – were modified to adapt to the special demands of microsurgical training [1, 35, 46].

Elementary Assessment Tools

Elementary assessment tools are based on observation (external or self-observation) and on the evaluation of process and result quality.

Process Quality

The most frequently used qualitative tool to assess process quality is direct observation by senior surgeons [36]. However, this approach has poor validity; it is purely subjective and assessor dependent, as well as time consuming [46]. Satterwhite et al. [47] used a self-evaluation tool in which trainees had to assess their own confidence in performing microsurgical anastomoses.

Result Quality

Assessing the macroscopic appearance of the anastomosis after transection, or using other visualization methods (photos, angiography, electron microscopy), has been mentioned in various reports [21, 48]. Further elementary assessment tools such as vessel patency (physical patency tests, e.g., “empty and refill,” transit-time flow probes, dye injections, strip tests, angiographic methods), vessel bleeding, and the physiological function of the vessel following anastomosis (noradrenaline- or potassium chloride-induced contraction) were reviewed by others [1]. Ghanem et al. [49] recently established and validated the anastomosis lapse index (ALI), a pool of 10 typical errors potentially leading to anastomosis failure, scored on photographs of longitudinally opened anastomoses. The ALI showed significant differences between novice and expert surgeons and was also suitable for detecting the improvement of skills over time in the setting of a 5-day microsurgical course.
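
The ALI logic described above, counting how many errors from a predefined pool are present on a photograph of the opened anastomosis, can be sketched as a simple scoring routine. This is an illustrative sketch only: the error labels below are hypothetical placeholders, not the validated ALI items of Ghanem et al. [49].

```python
# Illustrative sketch of ALI-style scoring: count how many errors from a
# predefined pool are observed. The error labels are hypothetical
# placeholders, NOT the validated ALI items.

ERROR_POOL = frozenset([
    "back-wall stitch", "oblique stitch", "loose knot", "torn vessel edge",
    "uneven suture spacing", "intimal flap", "missed bite",
    "excess adventitia", "asymmetric bites", "leak point",
])

def ali_score(observed_errors):
    """Return the number of distinct pool errors observed (0-10);
    a lower index indicates a better anastomosis."""
    observed = set(observed_errors)
    unknown = observed - ERROR_POOL
    if unknown:
        raise ValueError(f"not in the error pool: {sorted(unknown)}")
    return len(observed)

novice = ali_score(["loose knot", "uneven suture spacing", "missed bite"])
expert = ali_score(["uneven suture spacing"])
assert novice == 3 and expert == 1  # lower score, better anastomosis
```

Because the score is a count over a fixed pool, it stays on a 0–10 scale regardless of how many raters annotate the photograph, which is what makes comparisons between novices and experts straightforward.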

Objective Structured Practical Examination, Checklists, and Global Rating Scales

The objective structured practical examination (OSPE) is an assessment tool in which the components of a certain competence are tested using checklists at different stations. The concept of the OSPE was first described in 1975 by Harden et al. [50] from Dundee. Since then, the OSPE has revolutionized clinical and practical examination and has been adopted worldwide for assessing skill acquisition [51].

In microsurgery, Grober et al. [52] used a detailed, dichotomous, task-specific 30-item evaluation checklist, in which 1 mark was awarded for each correctly performed step of the procedure, in parallel with a GRS. Regehr et al. [53] demonstrated the superior reliability and validity of a global rating scale compared with a task-specific checklist in a general surgical bench-training situation. Moulton et al. [17] tested 2 different microsurgical training concepts using various assessment tools and concluded that global ratings seemed to be a more accurate measure of surgical performance than checklists; improvements in checklist scores did not follow any consistent pattern.

The introduction of GRSs eliminated several drawbacks of checklists. Checklists break down a complex procedure into its most fundamental steps and assess these in a rigid system, whereas surgeons usually approach a task in a synthesizing manner and cannot stick to the rigid construct of a checklist [1]. Furthermore, checklists usually have no internal weighting for the different steps of a procedure and only use a binary system (step performed or not, correctly or incorrectly) [1, 46, 54].

GRSs were initially developed for general surgical procedures but were later adapted for microsurgery and modified by several authors [1, 35, 46, 55-57] (Table 2). These tools make possible the parallel assessment of various details believed to determine microsurgical outcome (respect for tissue handling, time, flow of surgery, dexterity). A scale with well-defined categories is usually implemented for scoring [1]. Ezra et al. [58] slightly modified the GRS in a study recruiting ophthalmology residents (Table 2). Temple and Ross [59] introduced knot-tying and anastomosis modules to create the UWOMSA (University of Western Ontario Microsurgical Skills Acquisition/Assessment Instrument); in their study, measures of criterion validity demonstrated strong agreement between the UWOMSA and the GRS (Table 2). The SAMS (Structured Assessment of Microsurgery) tool, introduced by Chan et al. [60, 61], is a complex assessment approach consisting of a GRS plus further novel elements such as an error list, comments, an overall performance scale, and an indicative skill for next performance scale (Table 2). The SAMS tool can measure skills on different axes; however, it is rather complex and time consuming. Recently, a refined GRS was reported by Satterwhite et al. [62] (Table 2). They introduced an online curriculum for microsurgical training combined with different prosthetic and ex vivo training models (latex glove model, Penrose drain, dorsal vessel of a chicken foot). Trainees were assessed with the novel SMaRT (Stanford Microsurgery and Resident Training Scale) tool, a GRS with 9 categories (instrument handling, respect for tissue, efficiency, suture handling, suturing technique, quality of knot, final product, operation flow, overall performance) and a scale ranging from 1 to 5.

Table 2.

Global rating scale and its main modifications


Technical Assessment Tools

Nowadays, digital video recording and imaging constitute a convenient and easy-to-use technical means of supporting the assessment of process and result quality. These recordings can be analyzed to visualize learning curves, e.g., time, suture placement, handling, survival, and complication rates in more complex surgical models such as transplantation [23, 24, 63-65]. Within our group, we used microscopic video recordings to evaluate the process of skill acquisition during training in rat orthotopic liver transplantation [23]. Recordings were assessed by the trainees, who identified unnecessary movements or faulty techniques with the help of their supervisors. Such guided analysis can, and should, provide the basis for trainees to understand their own errors and to develop strategies for preventing them. This can be extended by applying so-called PDCA cycles (plan – do – check – act). This instrument, used frequently in quality control and management, calls for the definition of a specific goal for a given procedure, followed by subsequent performance analysis. Each round of performance is assessed carefully with respect to the predefined goal and is intended to set the goal for the next training round [65]. Applied consistently, this approach can help to set defined technical goals for each repetition and limit the number of repetitions needed until a technique is mastered [65].
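The PDCA loop described above can be sketched as a simple training iteration. The goal metric (anastomosis time in minutes), the threshold, and the simulated learning curve are hypothetical examples for illustration, not values from the cited curriculum [65].

```python
# Minimal sketch of a PDCA (plan-do-check-act) training loop.
# Goal metric and threshold are hypothetical, not from the cited study.

def pdca_training(perform_attempt, goal_minutes: float, max_rounds: int = 10):
    """Repeat training rounds until the predefined goal is met."""
    history = []
    for round_no in range(1, max_rounds + 1):
        planned_goal = goal_minutes            # Plan: define the target
        result = perform_attempt(round_no)     # Do: perform the procedure
        history.append(result)
        if result <= planned_goal:             # Check: compare with the goal
            return round_no, history           # goal reached in this round
        # Act: the analysis of this round informs the focus of the next one
    return None, history                       # goal not reached within limit

# Simulated trainee starting at 38 min and improving by 2 min per attempt.
rounds_needed, times = pdca_training(lambda n: 40 - 2 * n, goal_minutes=30.0)
print(rounds_needed)  # 5
```

The point of the loop is the "check" step: each repetition is evaluated against an explicit, predefined goal, so progress is measured rather than assumed.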

Video-based learning can be further supported by automated analysis algorithms. Motion tracking, like other measures mentioned in this section, gained popularity through efforts to assess skill acquisition in general surgery [66, 67]. To date, only a handful of studies have used motion-tracking devices in microsurgical scenarios (Table 3) [17, 57, 58, 68-70]. According to the conclusions of these works, hand motion analysis might be a valuable and valid quantitative tool for assessing microsurgical skill acquisition. The major drawback of these complex systems, however, is the price of the software and devices, especially if the simultaneous assessment of several candidates is necessary.
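The basic quantities such systems report, e.g., total path length and number of discrete movements of the tracked hand, can be computed from sampled sensor positions. The 3-dimensional sample track, the sampling interval, and the speed threshold used to segment "movements" below are illustrative assumptions, not a published algorithm.

```python
import math

# Hedged sketch of elementary hand-motion metrics (path length, number of
# movements) computed from sampled (x, y, z) sensor positions.

def path_length(samples):
    """Total distance travelled by the sensor over consecutive samples."""
    return sum(math.dist(a, b) for a, b in zip(samples, samples[1:]))

def count_movements(samples, dt, speed_threshold):
    """Count rest-to-motion transitions using a simple speed threshold."""
    moving_prev = False
    movements = 0
    for a, b in zip(samples, samples[1:]):
        moving = math.dist(a, b) / dt > speed_threshold
        if moving and not moving_prev:
            movements += 1
        moving_prev = moving
    return movements

# Toy track: move, pause, move, pause (positions in arbitrary units).
track = [(0, 0, 0), (1, 0, 0), (1, 0, 0), (1, 1, 0), (1, 1, 0)]
print(path_length(track))                                    # 2.0
print(count_movements(track, dt=0.1, speed_threshold=1.0))   # 2
```

Economy of motion is then typically expressed as shorter path length and fewer discrete movements for the same completed task, which is the rationale behind comparing experts and trainees on these metrics.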

Table 3.

Studies implementing motion analysis


Conventional hand motion tracking is based on electromagnetic sensors placed on the surgeon's hand. However, this is not the only approach to motion analysis in microsurgical research. Shah et al. [71] recently reported their preliminary experience with UberSense, a free-of-charge smartphone application for slow-motion video analysis used mostly by athletes. The authors adapted this method to analyze video recordings of microsurgical interventions performed by a trainee or by a consultant. They designated several target parameters that can be measured quantitatively using UberSense (Table 3). The validity and usefulness of this interesting and cost-effective tool need to be assessed further in studies with larger sample sizes.

McGoldrick et al. [72] proposed a novel video instrument motion analysis tool to objectively measure movements during microsurgical tasks (Table 3). They reported that motion analysis performance scores correlated well with SMaRT scale scores. The authors cited the high initial cost (USD 5,000 for the software) as a disadvantage of this approach.

Harada et al. [73] published their experience with a novel complex motion analysis device consisting of infrared optical motion tracking, a Bluetooth-based inertial measurement unit, and strain gauges fixed to the end of a microforceps (Table 3). With this method, not only hand movements and times but also the needle gripping force can be analyzed. They demonstrated a slightly lower needle gripping force applied by expert surgeons compared with less skilled trainees; however, the difference did not reach statistical significance.

Using a computer program and electronic shutter goggles, various authors have attempted to demonstrate the role of visual-spatial ability (the ability of the brain to manipulate 2-dimensional and 3-dimensional images) in skill acquisition during complex surgical and microsurgical interventions [57, 74, 75]. The validity of stereoscopic visual acuity (SVA) was assessed by Grober et al. [57]. The aim of these measurements was to determine the smallest stereoscopic depth that a candidate can still detect. The candidates' performance in the SVA test was compared with their GRS scores; the authors concluded that SVA did not correlate significantly with global rating scores. Horváth et al. [76] used PAM tremorometry, originally developed for the diagnosis of Parkinson disease, to analyze microsurgical skills with 2 different instrument-handling approaches.

Concluding Remarks and Future Perspectives

Several new principles of experimental research have been introduced during the last decades (3Rs, GLP, 3Cs), representing major driving forces in the development of microsurgical research and training. Given that no perfect training model is available, microsurgical training always has to be a well-designed multistep program comprising different kinds of models, from very simple dry models to demanding living models.

The history of microsurgical training and the traditional aim of better standardization go back several decades [33]. The increasing use of modern objective assessment tools (e.g., OSPE, checklists, GRSs, motion analysis), however, has emerged only in the last few years.

The authors recommend the combined use of assessment tools and devices. At the beginning of training, when the presence of an experienced tutor is strongly emphasized [13], direct observation by an expert surgeon, GRSs, and elementary assessment approaches such as patency tests can provide effective, reliable feedback on the training. In more advanced stages, the introduction of more complex measures (e.g., hand and instrument motion analysis, needle grip force) should be considered alongside GRSs, learning curves, and self-evaluation using video recordings or checklists. Other factors, such as the manpower and resources of the laboratory, as well as the comfort and expectations of the students, also need to be considered when selecting assessment tools for training courses.

Great pioneers of experimental microsurgical training, such as Sun Lee or Robert D. Acland, did not base their work on the use of sophisticated assessment tools. Their models and training concepts became legendary through their extraordinary surgical skills and their personalities as teachers and academics. This nonmeasurable human aspect of training has to be appreciated in the era of objective skill assessment as well.

The survey study presented in this paper, confirmed by others, demonstrated a huge demand for microsurgical training opportunities. However, accessibility and "service" vary greatly among countries. In our opinion, 1- or 2-day courses cannot be effective for learning the basics of microsurgery: such short courses are too long for a mere introduction, yet insufficient for effective skill acquisition. Courses offering 40 h or more might be suitable for basic training. With increasing globalization and mobility, people may search for the best service for their money within a widening geographical range, which frequently results in "course tourism."

In line with previous authors, we suggest abandoning the obsolete concept of the "certificate of attendance" at the end of microsurgical training courses and introducing a new "certificate of attendance and competence" [43]. This certificate must contain a grade of competence, assessed throughout the training using the aforementioned objective assessment tools.

An online registry keeping all relevant courses and institutions, as well as microsurgeons and tutors, in one place would significantly increase transparency and standards in this field. It would also make it simpler for institutions to promote their programs and for scientists to build their microsurgical networks. Higher standards and better transparency in experimental microsurgery could also result in improved clinical outcomes as well as better data quality in research. There is a strong worldwide need for better communication between working groups and for improvements in standardizing microsurgical research and training. We believe that active collaboration between our societies can provide a solid basis for these endeavors in the future.

Disclosure Statement

The authors of this paper have no conflicts of interest to disclose.


References

  1. Ramachandran S, Ghanem AM, Myers SR: Assessment of microsurgery competency – where are we now? Microsurgery 2013; 33:406–415.
  2. Russell WM, Burch RL: The Principles of Humane Experimental Technique. London, Methuen, 1959.
  3. Gross D, Tolba RH: Ethics in animal-based research. Eur Surg Res 2015; 55:43–57.
  4. Pereira S, Tettamanti M: Ahimsa and alternatives – the concept of the 4th R. The CPCSEA in India. ALTEX 2005; 22:3–6.
  5. Handbook: Good Laboratory Practice (GLP): Quality Practices for Regulated Non-Clinical Research and Development, ed 2. Switzerland, World Health Organization, 2009.
  6. Kobayashi E: Present and aspects for cadaver surgical training in Japan (in Japanese). Kyobu Geka 2015; 68:204–211.
  7. Knox AD, Gilardino MS, Kasten SJ, Warren RJ, Anastakis DJ: Competency-based medical education for plastic surgery: where do we begin? Plast Reconstr Surg 2014; 133:702e–710e.
  8. Oltean M, Sassu P, Hellstrom M, Axelsson P, Ewaldsson L, Nilsson AG, Axelsson M: The microsurgical training programme in Gothenburg, Sweden: early experiences. J Plast Surg Hand Surg 2016; 1–6.
  9. Hoyt RF Jr, Clevenger RR, McGehee JA: Microsurgical instrumentation and suture material. Lab Anim (NY) 2001; 30:38–45.
  10. MacDonald JD: Learning to perform microvascular anastomosis. Skull Base 2005; 15:229–240.
  11. VanderKam V: Care of microvascular surgical instruments. Plast Surg Nurs 1999; 19:31–34.
  12. Furka I, Brath E, Nemeth N, Miko I: Conceptions about microsurgical education. What were 5,460 hours of microsurgical basic education enough for? (in Hungarian). Magy Seb 2006; 59:147–151.
  13. Furka I, Brath E, Nemeth N, Miko I: Learning microsurgical suturing and knotting techniques: comparative data. Microsurgery 2006; 26:4–7.
  14. Miko I, Brath E, Furka I: Basic teaching in microsurgery. Microsurgery 2001; 21:121–123.
  15. Kerr B, O’Leary JP: The training of the surgeon: Dr. Halsted’s greatest legacy. Am Surg 1999; 65:1101–1102.
  16. Dumestre D, Yeung JK, Temple-Oberle C: Evidence-based microsurgical skill-acquisition series. Part 1. Validated microsurgical models – a systematic review. J Surg Educ 2014; 71:329–338.
  17. Moulton CA, Dubrowski A, Macrae H, Graham B, Grober E, Reznick R: Teaching surgical skills: what kind of practice makes perfect?: A randomized, controlled trial. Ann Surg 2006; 244:400–409.
  18. Ilie VG, Ilie VI, Dobreanu C, Ghetu N, Luchian S, Pieptu D: Training of microsurgical skills on nonliving models. Microsurgery 2008; 28:571–577.
  19. Meier AH, Rawn CL, Krummel TM: Virtual reality: surgical application – challenge for the new millennium. J Am Coll Surg 2001; 192:372–384.
  20. Erel E, Aiyenibe B, Butler PE: Microsurgery simulators in virtual reality: review. Microsurgery 2003; 23:147–152.
  21. Schoffl H, Froschauer SM, Dunst KM, Hager D, Kwasny O, Huemer GM: Strategies for the reduction of live animal use in microsurgical training and education. Altern Lab Anim 2008; 36:153–160.
  22. Singh M, Ziolkowski N, Ramachandran S, Myers SR, Ghanem AM: Development of a five-day basic microsurgery simulation training course: a cost analysis. Arch Plast Surg 2014; 41:213–217.
  23. Czigany Z, Iwasaki J, Yagi S, Nagai K, Szijarto A, Uemoto S, Tolba RH: Improving research practice in rat orthotopic and partial orthotopic liver transplantation: a review, recommendation, and publication guide. Eur Surg Res 2015; 55:119–138.
  24. Holzen JP, Palmes D, Langer M, Spiegel HU: Microsurgical training curriculum for learning kidney and liver transplantation in the rat. Microsurgery 2005; 25:614–623.
  25. Messaoudi T, Bodin F, Hidalgo Diaz JJ, Ichihara S, Fikry T, Lacreuse I, Liverneaux P, Facca S: Evaluation of a new e-learning platform for distance teaching of microsurgery. Chir Main 2015; 34:109–112.
  26. Nagai K, Yagi S, Uemoto S, Tolba RH: Surgical procedures for a rat model of partial orthotopic liver transplantation with hepatic arterial reconstruction. J Vis Exp 2013;e4376.
  27. Oldani G, Lacotte S, Morel P, Mentha G, Toso C: Orthotopic liver transplantation in rats. J Vis Exp 2012; 65:4143.
  28. Belykh E, Byvaltsev V: Off-the-job microsurgical training on dry models: Siberian experience. World Neurosurg 2014; 82:20–24.
  29. Lannon DA, Atkins JA, Butler PE: Non-vital, prosthetic, and virtual reality models of microsurgical training. Microsurgery 2001; 21:389–393.
  30. Kobayashi E, Haga J: Translational microsurgery. A new platform for transplantation research. Acta Cir Bras 2016; 31:212–217.
  31. Price J, Naik V, Boodhwani M, Brandys T, Hendry P, Lam BK: A randomized evaluation of simulation training on performance of vascular anastomosis on a high-fidelity in vivo model: The role of deliberate practice. J Thorac Cardiovasc Surg 2011; 142:496–503.
  32. Shurey S, Akelina Y, Legagneux J, Malzone G, Jiga L, Ghanem AM: The rat model in microsurgery education: classical exercises and new horizons. Arch Plast Surg 2014; 41:201–208.
  33. Di Cataldo A, La Greca G, Rodolico M, Candiano C, Li Destri G, Puleo S: Experimental models in microsurgery. Microsurgery 1998; 18:454–459.
  34. Chan WY, Matteucci P, Southern SJ: Validation of microsurgical models in microsurgery training and competence: a review. Microsurgery 2007; 27:494–499.
  35. Dumestre D, Yeung JK, Temple-Oberle C: Evidence-based microsurgical skills acquisition series. Part 2. Validated assessment instruments – a systematic review. J Surg Educ 2015; 72:80–89.
  36. Leung CC, Ghanem AM, Tos P, Ionac M, Froschauer S, Myers SR: Towards a global understanding and standardisation of education and training in microsurgery. Arch Plast Surg 2013; 40:304–311.
  37. Ghanem AM, Hachach-Haram N, Leung CC, Myers SR: A systematic review of evidence for education and training interventions in microsurgery. Arch Plast Surg 2013; 40:312–319.
  38. Kolbenschlag J, Gehl B, Daigeler A, Kremer T, Hirche C, Vogt PM, Horch R, Lehnhardt M, Kneser U: Microsurgical training in Germany – results of a survey among trainers and trainees (in German). Handchir Mikrochir Plast Chir 2014; 46:234–241.
  39. Al-Bustani S, Halvorson EG: Status of microsurgical simulation training in plastic surgery: a survey of United States program directors. Ann Plast Surg 2016; 76:713–716.
  40. Goossens DP, Gruel SM, Rao VK: A survey of microsurgery training in the United States. Microsurgery 1990; 11:2–4.
  41. Alzakri A, Al-Rajeh M, Liverneaux PA, Facca S: Courses in microsurgical techniques in France and abroad (in French). Chir Main 2014; 33:219–223.
  42. Di Cataldo A, Puleo S, Rodolico G: Three microsurgical courses in Catania. Microsurgery 1998; 18:449–453.
  43. Studinger RM, Bradford MM, Jackson IT: Microsurgical training: is it adequate for the operating room? Eur J Plast Surg 2005; 28:91–93.
  44. Christensen TJ, Anding W, Shin AY, Bishop AT, Moran SL: The influence of microsurgical training on the practice of hand surgeons. J Reconstr Microsurg 2015; 31:442–449.
  45. Atkins JL, Kalu PU, Lannon DA, Green CJ, Butler PE: Training in microsurgical skills: does course-based learning deliver? Microsurgery 2005; 25:481–485.
  46. Kalu PU, Atkins J, Baker D, Green CJ, Butler PE: How do we assess microsurgical skill? Microsurgery 2005; 25:25–29.
  47. Satterwhite T, Son J, Carey J, Zeidler K, Bari S, Gurtner G, Chang J, Lee GK: Microsurgery education in residency training: validating an online curriculum. Ann Plast Surg 2012; 68:410–414.
  48. Odobescu A, Moubayed SP, Harris PG, Bou-Merhi J, Daniels E, Danino MA: A new microsurgical research model using Thiel-embalmed arteries and comparison of two suture techniques. J Plast Reconstr Aesthet Surg 2014; 67:389–395.
  49. Ghanem AM, Al Omran Y, Shatta B, Kim E, Myers S: Anastomosis lapse index (ALI): a validated end product assessment tool for simulation microsurgery training. J Reconstr Microsurg 2016; 32:233–241.
  50. Harden RM, Stevenson M, Downie WW, Wilson GM: Assessment of clinical competence using objective structured examination. Br Med J 1975; 1:447–451.
  51. Ananthakrishnan N: Objective structured clinical/practical examination (OSCE/OSPE). J Postgrad Med 1993; 39:82–84.
  52. Grober ED, Hamstra SJ, Wanzel KR, Reznick RK, Matsumoto ED, Sidhu RS, Jarvi KA: The educational impact of bench model fidelity on the acquisition of technical skill: the use of clinically relevant outcome measures. Ann Surg 2004; 240:374–381.
  53. Regehr G, MacRae H, Reznick RK, Szalay D: Comparing the psychometric properties of checklists and global rating scales for assessing performance on an OSCE-format examination. Acad Med 1998; 73:993–997.
  54. Martin JA, Regehr G, Reznick R, MacRae H, Murnaghan J, Hutchison C, Brown M: Objective structured assessment of technical skill (OSATS) for surgical residents. Br J Surg 1997; 84:273–278.
  55. Schueneman AL, Pickleman J, Hesslein R, Freeark RJ: Neuropsychologic predictors of operative skill among general surgery residents. Surgery 1984; 96:288–295.
  56. Grober ED, Hamstra SJ, Wanzel KR, Reznick RK, Matsumoto ED, Sidhu RS, Jarvi KA: Laboratory based training in urological microsurgery with bench model simulators: a randomized controlled trial evaluating the durability of technical skill. J Urol 2004; 172:378–381.
  57. Grober ED, Hamstra SJ, Wanzel KR, Reznick RK, Matsumoto ED, Sidhu RS, Jarvi KA: Validation of novel and objective measures of microsurgical skill: hand-motion analysis and stereoscopic visual acuity. Microsurgery 2003; 23:317–322.
  58. Ezra DG, Aggarwal R, Michaelides M, Okhravi N, Verma S, Benjamin L, Bloom P, Darzi A, Sullivan P: Skills acquisition and assessment after a microsurgical skills course for ophthalmology residents. Ophthalmology 2009; 116:257–262.
  59. Temple CL, Ross DC: A new, validated instrument to evaluate competency in microsurgery: the University of Western Ontario Microsurgical Skills Acquisition/Assessment instrument (outcomes article). Plast Reconstr Surg 2011; 127:215–222.
  60. Chan W, Niranjan N, Ramakrishnan V: Structured assessment of microsurgery skills in the clinical setting. J Plast Reconstr Aesthet Surg 2010; 63:1329–1334.
  61. Selber JC, Chang EI, Liu J, Suami H, Adelman DM, Garvey P, Hanasono MM, Butler CE: Tracking the learning curve in microsurgical skill acquisition. Plast Reconstr Surg 2012; 130:550e–557e.
  62. Satterwhite T, Son J, Carey J, Echo A, Spurling T, Paro J, Gurtner G, Chang J, Lee GK: The Stanford microsurgery and resident training (SMaRT) scale: validation of an on-line global rating scale for technical assessment. Ann Plast Surg 2014; 72(suppl 1):S84–S88.
  63. Hori T, Nguyen JH, Zhao X, Ogura Y, Hata T, Yagi S, Chen F, Baine AM, Ohashi N, Eckman CB, Herdt AR, Egawa H, Takada Y, Oike F, Sakamoto S, Kasahara M, Ogawa K, Hata K, Iida T, Yonekawa Y, Sibulesky L, Kuribayashi K, Kato T, Saito K, Wang L, Torii M, Sahara N, Kamo N, Sahara T, Yasutomi M, Uemoto S: Comprehensive and innovative techniques for liver transplantation in rats: a surgical guide. World J Gastroenterol 2010; 16:3120–3132.
  64. Starkes JL, Payk I, Hodges NJ: Developing a standardized test for the assessment of suturing skill in novice microsurgeons. Microsurgery 1998; 18:19–22.
  65. Jin H, Huang H, Dong W, Sun J, Liu A, Deng M, Dirsch O, Dahmen U: Preliminary experience of a PDCA-cycle and quality management based training curriculum for rat liver transplantation. J Surg Res 2012; 176:409–422.
  66. Reiley CE, Lin HC, Yuh DD, Hager GD: Review of methods for objective surgical skill evaluation. Surg Endosc 2011; 25:356–366.
  67. Van Nortwick SS, Lendvay TS, Jensen AR, Wright AS, Horvath KD, Kim S: Methodologies for establishing validity in surgical simulation studies. Surgery 2010; 147:622–630.
  68. Saleh GM, Voyatzis G, Hance J, Ratnasothy J, Darzi A: Evaluating surgical dexterity during corneal suturing. Arch Ophthalmol 2006; 124:1263–1266.
  69. Saleh GM, Gauba V, Sim D, Lindfield D, Borhani M, Ghoussayni S: Motion analysis as a tool for the evaluation of oculoplastic surgical skill: evaluation of oculoplastic surgical skill. Arch Ophthalmol 2008; 126:213–216.
  70. Grober ED, Roberts M, Shin EJ, Mahdi M, Bacal V: Intraoperative assessment of technical skills on live patients using economy of hand motion: establishing learning curves of surgical competence. Am J Surg 2010; 199:81–85.
  71. Shah A, Rowlands M, Patel A, Fusi S, Salomon J: Ubersense: using a free video analysis app to evaluate and improve microsurgical skills. Plast Reconstr Surg 2014; 134:338e–339e.
  72. McGoldrick RB, Davis CR, Paro J, Hui K, Nguyen D, Lee GK: Motion analysis for microsurgical training: objective measures of dexterity, economy of movement, and ability. Plast Reconstr Surg 2015; 136:231e–240e.
  73. Harada K, Morita A, Minakawa Y, Baek YM, Sora S, Sugita N, Kimura T, Tanikawa R, Ishikawa T, Mitsuishi M: Assessing microneurosurgical skill with medico-engineering technology. World Neurosurg 2015; 84:964–971.
  74. Wanzel KR, Hamstra SJ, Anastakis DJ, Matsumoto ED, Cusimano MD: Effect of visual-spatial ability on learning of spatially-complex surgical skills. Lancet 2002; 359:230–231.
  75. Murdoch JR, Bainbridge LC, Fisher SG, Webster MH: Can a simple test of visual-motor skill predict the performance of microsurgeons? J R Coll Surg Edinb 1994; 39:150–152.
  76. Horváth A, Valalik I, Csokay A: Suture of minimal-diameter vessels using fingertip support technique. J Hand Microsurg 2013; 5:44–45.

Author Contacts

René H. Tolba, MD, PhD

Institute for Laboratory Animal Science and Experimental Surgery, RWTH Aachen University, Pauwelsstrasse 30

DE–52074 Aachen (Germany)

E-Mail rtolba@ukaachen.de

Norbert Nemeth, MD, PhD

Department of Operative Techniques and Surgical Research, Faculty of Medicine, University of Debrecen, Moricz Zs. krt. 22

HU–4032 Debrecen (Hungary)

E-Mail nemeth@med.unideb.hu


Article / Publication Details


Received: May 29, 2017
Accepted: June 27, 2017
Published online: July 26, 2017
Issue release date: Published online first (Issue-in-Progress)

Number of Print Pages: 17
Number of Figures: 0
Number of Tables: 3

ISSN: 0014-312X (Print)
eISSN: 1421-9921 (Online)

For additional information: http://www.karger.com/ESR


Copyright / Drug Dosage / Disclaimer

Copyright: All rights reserved. No part of this publication may be translated into other languages, reproduced or utilized in any form or by any means, electronic or mechanical, including photocopying, recording, microcopying, or by any information storage and retrieval system, without permission in writing from the publisher.
Drug Dosage: The authors and the publisher have exerted every effort to ensure that drug selection and dosage set forth in this text are in accord with current recommendations and practice at the time of publication. However, in view of ongoing research, changes in government regulations, and the constant flow of information relating to drug therapy and drug reactions, the reader is urged to check the package insert for each drug for any changes in indications and dosage and for added warnings and precautions. This is particularly important when the recommended agent is a new and/or infrequently employed drug.
Disclaimer: The statements, opinions and data contained in this publication are solely those of the individual authors and contributors and not of the publishers and the editor(s). The appearance of advertisements or/and product references in the publication is not a warranty, endorsement, or approval of the products or services advertised or of their effectiveness, quality or safety. The publisher and the editor(s) disclaim responsibility for any injury to persons or property resulting from any ideas, methods, instructions or products referred to in the content or advertisements.

References

  1. Ramachandran S, Ghanem AM, Myers SR: Assessment of microsurgery competency – where are we now? Microsurgery 2013; 33:406–415.
  2. Russell WM, Burch RL: The Principles of Humane Experimental Technique. London, Methuen, 1959.
  3. Gross D, Tolba RH: Ethics in animal-based research. Eur Surg Res 2015; 55:43–57.
  4. Pereira S, Tettamanti M: Ahimsa and alternatives – the concept of the 4th R. The CPCSEA in India. ALTEX 2005; 22:3–6.
    External Resources
  5. Handbook: Good Laboratory Practice (GLP): Quality Practices for Regulated Non-Clinical Research and Development, ed 2. Switzerland, World Health Organization, 2009.
  6. Kobayashi E: Present and aspects for cadaver surgical training in Japan (in Japanese). Kyobu Geka 2015; 68:204–211.
    External Resources
  7. Knox AD, Gilardino MS, Kasten SJ, Warren RJ, Anastakis DJ: Competency-based medical education for plastic surgery: where do we begin? Plast Reconstr Surg 2014; 133:702e–710e.
  8. Oltean M, Sassu P, Hellstrom M, Axelsson P, Ewaldsson L, Nilsson AG, Axelsson M: The microsurgical training programme in Gothenburg, Sweden: early experiences. J Plast Surg Hand Surg 2016; 1–6.
  9. Hoyt RF Jr: Clevenger RR, McGehee JA: Microsurgical instrumentation and suture material. Lab Anim (NY) 2001; 30:38–45.
    External Resources
  10. MacDonald JD: Learning to perform microvascular anastomosis. Skull Base 2005; 15:229–240.
  11. VanderKam V: Care of microvascular surgical instruments. Plast Surg Nurs 1999; 19:31–34.
  12. Furka I, Brath E, Nemeth N, Miko I: Conceptions about microsurgical education. What were 5,460 hours of microsurgical basic education enough for? (in Hungarian). Magy Seb 2006; 59:147–151.
    External Resources
  13. Furka I, Brath E, Nemeth N, Miko I: Learning microsurgical suturing and knotting techniques: comparative data. Microsurgery 2006; 26:4–7.
  14. Miko I, Brath E, Furka I: Basic teaching in microsurgery. Microsurgery 2001; 21:121–123.
  15. Kerr B, O’Leary JP: The training of the surgeon: Dr. Halsted’s greatest legacy. Am Surg 1999; 65:1101–1102.
    External Resources
  16. Dumestre D, Yeung JK, Temple-Oberle C: Evidence-based microsurgical skill-acquisition series. Part 1. Validated microsurgical models – a systematic review. J Surg Educ 2014; 71:329–338.
  17. Moulton CA, Dubrowski A, Macrae H, Graham B, Grober E, Reznick R: Teaching surgical skills: what kind of practice makes perfect?: A randomized, controlled trial. Ann Surg 2006; 244:400–409.
  18. Ilie VG, Ilie VI, Dobreanu C, Ghetu N, Luchian S, Pieptu D: Training of microsurgical skills on nonliving models. Microsurgery 2008; 28:571–577.
  19. Meier AH, Rawn CL, Krummel TM: Virtual reality: surgical application – challenge for the new millennium. J Am Coll Surg 2001; 192:372–384.
  20. Erel E, Aiyenibe B, Butler PE: Microsurgery simulators in virtual reality: review. Microsurgery 2003; 23:147–152.
  21. Schoffl H, Froschauer SM, Dunst KM, Hager D, Kwasny O, Huemer GM: Strategies for the reduction of live animal use in microsurgical training and education. Altern Lab Anim 2008; 36:153–160.
    External Resources
  22. Singh M, Ziolkowski N, Ramachandran S, Myers SR, Ghanem AM: Development of a five-day basic microsurgery simulation training course: a cost analysis. Arch Plast Surg 2014; 41:213–217.
  23. Czigany Z, Iwasaki J, Yagi S, Nagai K, Szijarto A, Uemoto S, Tolba RH: Improving research practice in rat orthotopic and partial orthotopic liver transplantation: a review, recommendation, and publication guide. Eur Surg Res 2015; 55:119–138.
  24. Holzen JP, Palmes D, Langer M, Spiegel HU: Microsurgical training curriculum for learning kidney and liver transplantation in the rat. Microsurgery 2005; 25:614–623.
  25. Messaoudi T, Bodin F, Hidalgo Diaz JJ, Ichihara S, Fikry T, Lacreuse I, Liverneaux P, Facca S: Evaluation of a new e-learning platform for distance teaching of microsurgery. Chir Main 2015; 34:109–112.
  26. Nagai K, Yagi S, Uemoto S, Tolba RH: Surgical procedures for a rat model of partial orthotopic liver transplantation with hepatic arterial reconstruction. J Vis Exp 2013;e4376.
    External Resources
  27. Oldani G, Lacotte S, Morel P, Mentha G, Toso C: Orthotopic liver transplantation in rats. J Vis Exp 2012; 65:4143.
  28. Belykh E, Byvaltsev V: Off-the-job microsurgical training on dry models: Siberian experience. World Neurosurg 2014; 82:20–24.
  29. Lannon DA, Atkins JA, Butler PE: Non-vital, prosthetic, and virtual reality models of microsurgical training. Microsurgery 2001; 21:389–393.
    External Resources
  30. Kobayashi E, Haga J: Translational microsurgery. A new platform for transplantation research. Acta Cir Bras 2016; 31:212–217.
  31. Price J, Naik V, Boodhwani M, Brandys T, Hendry P, Lam BK: A randomized evaluation of simulation training on performance of vascular anastomosis on a high-fidelity in vivo model: The role of deliberate practice. J Thorac Cardiovasc Surg 2011; 142:496–503.
  32. Shurey S, Akelina Y, Legagneux J, Malzone G, Jiga L, Ghanem AM: The rat model in microsurgery education: classical exercises and new horizons. Arch Plast Surg 2014; 41:201–208.
  33. Di Cataldo A, La Greca G, Rodolico M, Candiano C, Li Destri G, Puleo S: Experimental models in microsurgery. Microsurgery 1998; 18:454–459.
  34. Chan WY, Matteucci P, Southern SJ: Validation of microsurgical models in microsurgery training and competence: a review. Microsurgery 2007; 27:494–499.
  35. Dumestre D, Yeung JK, Temple-Oberle C: Evidence-based microsurgical skills acquisition series. Part 2. Validated assessment instruments – a systematic review. J Surg Educ 2015; 72:80–89.
  36. Leung CC, Ghanem AM, Tos P, Ionac M, Froschauer S, Myers SR: Towards a global understanding and standardisation of education and training in microsurgery. Arch Plast Surg 2013; 40:304–311.
  37. Ghanem AM, Hachach-Haram N, Leung CC, Myers SR: A systematic review of evidence for education and training interventions in microsurgery. Arch Plast Surg 2013; 40:312–319.
  38. Kolbenschlag J, Gehl B, Daigeler A, Kremer T, Hirche C, Vogt PM, Horch R, Lehnhardt M, Kneser U: Microsurgical training in Germany – results of a survey among trainers and trainees (in German). Handchir Mikrochir Plast Chir 2014; 46:234–241.
  39. Al-Bustani S, Halvorson EG: Status of microsurgical simulation training in plastic surgery: a survey of United States program directors. Ann Plast Surg 2016; 76:713–716.
  40. Goossens DP, Gruel SM, Rao VK: A survey of microsurgery training in the United States. Microsurgery 1990; 11:2–4.
  41. Alzakri A, Al-Rajeh M, Liverneaux PA, Facca S: Courses in microsurgical techniques in France and abroad (in French). Chir Main 2014; 33:219–223.
  42. Di Cataldo A, Puleo S, Rodolico G: Three microsurgical courses in Catania. Microsurgery 1998; 18:449–453.
  43. Studinger RM, Bradford MM, Jackson IT: Microsurgical training: is it adequate for the operating room? Eur J Plast Surg 2005; 28:91–93.
  44. Christensen TJ, Anding W, Shin AY, Bishop AT, Moran SL: The influence of microsurgical training on the practice of hand surgeons. J Reconstr Microsurg 2015; 31:442–449.
  45. Atkins JL, Kalu PU, Lannon DA, Green CJ, Butler PE: Training in microsurgical skills: does course-based learning deliver? Microsurgery 2005; 25:481–485.
  46. Kalu PU, Atkins J, Baker D, Green CJ, Butler PE: How do we assess microsurgical skill? Microsurgery 2005; 25:25–29.
  47. Satterwhite T, Son J, Carey J, Zeidler K, Bari S, Gurtner G, Chang J, Lee GK: Microsurgery education in residency training: validating an online curriculum. Ann Plast Surg 2012; 68:410–414.
  48. Odobescu A, Moubayed SP, Harris PG, Bou-Merhi J, Daniels E, Danino MA: A new microsurgical research model using Thiel-embalmed arteries and comparison of two suture techniques. J Plast Reconstr Aesthet Surg 2014; 67:389–395.
  49. Ghanem AM, Al Omran Y, Shatta B, Kim E, Myers S: Anastomosis lapse index (ALI): a validated end product assessment tool for simulation microsurgery training. J Reconstr Microsurg 2016; 32:233–241.
  50. Harden RM, Stevenson M, Downie WW, Wilson GM: Assessment of clinical competence using objective structured examination. Br Med J 1975; 1:447–451.
  51. Ananthakrishnan N: Objective structured clinical/practical examination (OSCE/OSPE). J Postgrad Med 1993; 39:82–84.
  52. Grober ED, Hamstra SJ, Wanzel KR, Reznick RK, Matsumoto ED, Sidhu RS, Jarvi KA: The educational impact of bench model fidelity on the acquisition of technical skill: the use of clinically relevant outcome measures. Ann Surg 2004; 240:374–381.
  53. Regehr G, MacRae H, Reznick RK, Szalay D: Comparing the psychometric properties of checklists and global rating scales for assessing performance on an OSCE-format examination. Acad Med 1998; 73:993–997.
  54. Martin JA, Regehr G, Reznick R, MacRae H, Murnaghan J, Hutchison C, Brown M: Objective structured assessment of technical skill (OSATS) for surgical residents. Br J Surg 1997; 84:273–278.
  55. Schueneman AL, Pickleman J, Hesslein R, Freeark RJ: Neuropsychologic predictors of operative skill among general surgery residents. Surgery 1984; 96:288–295.
  56. Grober ED, Hamstra SJ, Wanzel KR, Reznick RK, Matsumoto ED, Sidhu RS, Jarvi KA: Laboratory based training in urological microsurgery with bench model simulators: a randomized controlled trial evaluating the durability of technical skill. J Urol 2004; 172:378–381.
  57. Grober ED, Hamstra SJ, Wanzel KR, Reznick RK, Matsumoto ED, Sidhu RS, Jarvi KA: Validation of novel and objective measures of microsurgical skill: hand-motion analysis and stereoscopic visual acuity. Microsurgery 2003; 23:317–322.
  58. Ezra DG, Aggarwal R, Michaelides M, Okhravi N, Verma S, Benjamin L, Bloom P, Darzi A, Sullivan P: Skills acquisition and assessment after a microsurgical skills course for ophthalmology residents. Ophthalmology 2009; 116:257–262.
  59. Temple CL, Ross DC: A new, validated instrument to evaluate competency in microsurgery: the University of Western Ontario Microsurgical Skills Acquisition/Assessment instrument (outcomes article). Plast Reconstr Surg 2011; 127:215–222.
  60. Chan W, Niranjan N, Ramakrishnan V: Structured assessment of microsurgery skills in the clinical setting. J Plast Reconstr Aesthet Surg 2010; 63:1329–1334.
  61. Selber JC, Chang EI, Liu J, Suami H, Adelman DM, Garvey P, Hanasono MM, Butler CE: Tracking the learning curve in microsurgical skill acquisition. Plast Reconstr Surg 2012; 130:550e–557e.
  62. Satterwhite T, Son J, Carey J, Echo A, Spurling T, Paro J, Gurtner G, Chang J, Lee GK: The Stanford microsurgery and resident training (SMaRT) scale: validation of an on-line global rating scale for technical assessment. Ann Plast Surg 2014; 72(suppl 1):S84–S88.
  63. Hori T, Nguyen JH, Zhao X, Ogura Y, Hata T, Yagi S, Chen F, Baine AM, Ohashi N, Eckman CB, Herdt AR, Egawa H, Takada Y, Oike F, Sakamoto S, Kasahara M, Ogawa K, Hata K, Iida T, Yonekawa Y, Sibulesky L, Kuribayashi K, Kato T, Saito K, Wang L, Torii M, Sahara N, Kamo N, Sahara T, Yasutomi M, Uemoto S: Comprehensive and innovative techniques for liver transplantation in rats: a surgical guide. World J Gastroenterol 2010; 16:3120–3132.
  64. Starkes JL, Payk I, Hodges NJ: Developing a standardized test for the assessment of suturing skill in novice microsurgeons. Microsurgery 1998; 18:19–22.
  65. Jin H, Huang H, Dong W, Sun J, Liu A, Deng M, Dirsch O, Dahmen U: Preliminary experience of a PDCA-cycle and quality management based training curriculum for rat liver transplantation. J Surg Res 2012; 176:409–422.
  66. Reiley CE, Lin HC, Yuh DD, Hager GD: Review of methods for objective surgical skill evaluation. Surg Endosc 2011; 25:356–366.
  67. Van Nortwick SS, Lendvay TS, Jensen AR, Wright AS, Horvath KD, Kim S: Methodologies for establishing validity in surgical simulation studies. Surgery 2010; 147:622–630.
  68. Saleh GM, Voyatzis G, Hance J, Ratnasothy J, Darzi A: Evaluating surgical dexterity during corneal suturing. Arch Ophthalmol 2006; 124:1263–1266.
  69. Saleh GM, Gauba V, Sim D, Lindfield D, Borhani M, Ghoussayni S: Motion analysis as a tool for the evaluation of oculoplastic surgical skill: evaluation of oculoplastic surgical skill. Arch Ophthalmol 2008; 126:213–216.
  70. Grober ED, Roberts M, Shin EJ, Mahdi M, Bacal V: Intraoperative assessment of technical skills on live patients using economy of hand motion: establishing learning curves of surgical competence. Am J Surg 2010; 199:81–85.
  71. Shah A, Rowlands M, Patel A, Fusi S, Salomon J: Ubersense: using a free video analysis app to evaluate and improve microsurgical skills. Plast Reconstr Surg 2014; 134:338e–339e.
  72. McGoldrick RB, Davis CR, Paro J, Hui K, Nguyen D, Lee GK: Motion analysis for microsurgical training: objective measures of dexterity, economy of movement, and ability. Plast Reconstr Surg 2015; 136:231e–240e.
  73. Harada K, Morita A, Minakawa Y, Baek YM, Sora S, Sugita N, Kimura T, Tanikawa R, Ishikawa T, Mitsuishi M: Assessing microneurosurgical skill with medico-engineering technology. World Neurosurg 2015; 84:964–971.
  74. Wanzel KR, Hamstra SJ, Anastakis DJ, Matsumoto ED, Cusimano MD: Effect of visual-spatial ability on learning of spatially-complex surgical skills. Lancet 2002; 359:230–231.
  75. Murdoch JR, Bainbridge LC, Fisher SG, Webster MH: Can a simple test of visual-motor skill predict the performance of microsurgeons? J R Coll Surg Edinb 1994; 39:150–152.
  76. Horváth A, Valalik I, Csokay A: Suture of minimal-diameter vessels using fingertip support technique. J Hand Microsurg 2013; 5:44–45.