
Colombian Journal of Anesthesiology

Print version ISSN 0120-3347 | Online version ISSN 2256-2087

Rev. colomb. anestesiol. vol. 48 no. 3, Bogotá, July/Sept. 2020. Epub Oct 15, 2020

https://doi.org/10.1097/cj9.0000000000000158 

Nonsystematic Review

Anesthesia assessment in the era of competences: state of the art

Sandra Ximena Jaramillo-Rincón a, b, *

Eduardo Durante c

Roberta Ladenheim d, e

Juan Carlos Díaz-Cortés f

a Clínica de Marly, Bogotá, Colombia.

b Universidad de los Andes, Bogotá, Colombia.

c Instituto Universitario Hospital Italiano de Buenos Aires, Buenos Aires, Argentina.

d Hospital Universitario CEMIC, Buenos Aires, Argentina.

e Ministry of Health and Social Development of the Republic of Argentina, Buenos Aires, Argentina.

f Clínica de Marly Jorge Cavelier Gaviria, Chía, Colombia.


Abstract

Introduction:

Anesthesiology requires performing procedures, resolving problems and crises in real time, and anticipating problems and complications, among other skills; therefore, the evaluation of its learning should center on how students achieve competence rather than solely on knowledge acquisition. The literature shows that, despite the existence of numerous assessment strategies, these are still undervalued in most cases due to unawareness.

Objective:

The present article aims to explain the process of competency-based assessment in anesthesiology, to provide a brief description of the learning domains and theories, instruments, and assessment systems used in this area, and, finally, to present some of the most relevant findings regarding assessment systems in Colombia.

Methodology:

The results obtained in "Characteristics of the evaluation systems used by anesthesiology residency programs in Colombia" showed a certain degree of unawareness among stakeholders in the educational process, which motivated the publication of this discussion of competency-based assessment in anesthesiology. Following a keyword search in PubMed, OVID, ERIC, DIALNET, and REDALYC, 110 articles were reviewed and 75 were deemed relevant for the study's theoretical framework.

Results and conclusion:

Assessment in anesthesiology should be conceived from the multidimensional nature of competency; it must be longitudinal and focused on the learning objectives.

Keywords: Educational Assessments; Competency-Based Education; Outcome and Process Assessment (Health Evaluation); Professional competence; Anesthesiology

Resumen

Introducción:

La anestesiología requiere la realización de procedimientos, resolución de problemas y crisis en tiempo real, previsión de problemas y complicaciones, entre otros, por lo tanto, la evaluación de su aprendizaje debería centrarse en cómo el estudiante alcanza la competencia y no solo en la adquisición de conocimientos. La literatura muestra que, a pesar de existir numerosas estrategias de evaluación, estas continúan siendo subvaloradas en muchos casos por desconocimiento.

Objetivo:

Este artículo pretende dar a conocer el proceso de evaluación en la anestesiología desde la competencia, además de sugerir una breve descripción de los dominios y teorías de aprendizaje, instrumentos y sistemas de evaluación en esta área y, finalmente, mostrar algunos de los resultados más relevantes sobre los sistemas de evaluación en Colombia.

Metodología:

Tras una búsqueda bibliográfica en PubMed, OVID, ERIC, DIALNET, REDALYC, con las palabras clave, se revisaron 110 artículos de los cuales 75 fueron considerados relevantes para elaborar el marco teórico de la investigación.

Resultados y conclusiones:

La evaluación en anestesiología debe ser concebida desde la multidimensionalidad de la competencia, ser longitudinal y enfocada en los objetivos de aprendizaje.

Palabras clave: Evaluación educacional; Educación basada en competencias; Evaluación por competencias; Anestesiología

Introduction

Education systems have changed dramatically over the past 20 years. The growing technical sophistication of science, the influx of information, and the influence of economics on scientific development have driven change not only in institutions but also in ways of thinking, expressed through changes in teaching and assessment methods as the differences in how children and adults learn have come to be understood.

Performance assessment should be a dynamic, systematic, and structured process that involves identifying assessment objectives, selecting and using multiple tools and instruments according to those objectives, and applying the actions derived from this process to optimize and guide learning.1,2

A literature search covering 1999 to 2017 focused on the assessment of anesthesiology students' performance, seeking to describe its theoretical and pedagogical foundations, educational principles, assessment tools, and implementation strategies from the standpoint of programmatic assessment and assessment for learning in this practical field of knowledge; 110 articles were reviewed and 73 were considered relevant for this review (Table 1, Fig. 1).

Table 1 Characteristics of the studies selected for the non-systematic review.

Source: Authors.

Figure 1 Literature search and selection criteria.

Source: Authors.

Below, readers will find the most relevant results of this narrative (non-systematic) literature review. First, we explain how the concept of competency has modified the anesthesiology assessment process over the past two decades, with a brief description of the domains and learning theories applied in anesthesiology. We then present the assessment instruments and systems currently recommended for the performance assessment of anesthesiology graduate students.

Anesthesiology assessment

Over their professional lives, anesthesiologists develop a number of complex skills that must be learned during training and honed with practice. Teachers have a responsibility to know what skills they should teach, how to do it, when to delegate responsibilities and when a resident is able to deal with the real world in unsupervised conditions.58

Some authors propose building on the classification by Gaba et al,59 based on the concept of "situation awareness," which describes three basic skills anesthesiologists should develop during training for conscious decision-making: interpreting subtle signals, interpreting and managing evolving situations, and applying specialized knowledge.58,59,83

Gaba et al classify the competencies in which anesthesiologists should be trained into technical and non-technical skills.6,7,58,59 The term "technical skills" refers to the execution of actions based on medical knowledge and a technical perspective, focused on control of the body and of thought (Table 2).38 The most widely studied are orotracheal intubation, vascular catheterization, regional anesthesia, crisis management, pain management, patient assessment, and critical care management.60,71

Table 2 Poulton's classification of technical skills.38  

Source: Authors.

The concept of non-technical skills refers to the cognitive and social skills and personal resources that enable safe and efficient task performance.6,80 The acquisition of these skills40 is what reduces the possibility of error and adverse events in patient care7 (Fig. 2).

Source: Adapted from non-technical skills for anaesthetists: developing and applying ANTS.80 Authorized by Rhona Flin.

Figure 2 Non-technical skills based on the ANTS system (Anaesthetists' non-technical skills). 

Currently, there are multiple theoretical frameworks focused on the application of different competency-based models (ACGME, CanMEDS, Union of European Medical Specialists [UEMS], SCARE, etc.), which have reached different degrees of development and scope (Table 3).18-22,30 Perhaps the most current vision is the approach based on entrustable professional activities proposed by Ten Cate since 2010, which is still being studied in depth in this area of medicine.27-29

Table 3 Anesthesia assessment competencies and domains. 

Source: Authors.

[Table 3 detail, team-working elements: exchange of information; assertive exercise of authority; capability assessment; support of other team members.]

How is it assessed in anesthesiology?

Purpose of the assessment

For years, assessment in anesthesia has focused on summative competency assessment related to clinical practice, patient interaction, and critical situation analysis, often at the end of rotations. Currently, the proposal is to emphasize real-time, sequential, and progressive assessment of the process and the individualization of learning, as well as the relevance of feedback within this process.4-6

Content of the assessment

The assessment of technical and non-technical performance should carry equal weight when forming a judgment.59,70,75 Traditionally, assessment in anesthesia has been limited to theoretical knowledge tests as the main source of information, coupled with unstructured direct observation of daily work and isolated, feedback-free logs focused on technical skill acquisition (Fig. 3).

Figure 3 Trends in anesthesiology performance assessment by skill type. 

A study by Ross et al found that most assessments related to the "patient care" and "medical knowledge" competencies (patient care, anesthetic plan and conduct, 35%, followed by use and interpretation of monitoring and equipment, 8.5%); 10.2% related to practice-based learning and improvement, most commonly self-directed learning (6.8%); and 9.7% related to the systems-based practice competency.11

Assessment tools

Although the educational literature supports the usefulness of multiple tools to assess performance, habit leads to the use of a single tool to define performance (global rotation assessments and multiple-choice tests).4 This type of assessment suffers from known limitations: the misuse and inadequate interpretation of rating scales, subjective performance judgments, and the "halo" effect, in which the result is determined by what is known to have occurred in the past.4-6

Despite the interest of the European Society of Anaesthesiology and the UEMS/European Board of Anaesthesiology in harmonizing assessment and certification tools for anesthesia programs in Europe,20-22,24 a recent study in the European Union10 found that assessment and certification processes for specialist anesthesia training were diverse. In many countries the traditional time-based training model remains in place, with an average duration of 5 years (range 2.75-7). The programs with the greatest number of assessment tools were competency-based (mean 9.1 [SD 2.97] vs. 7.0 [SD 1.97]; p = 0.03). The most frequently mentioned tools were direct clinical observation, feedback, oral questions and/or multiple-choice tests, procedure logs, and portfolios. Most countries had a national-level certification process.

Some competency-based anesthesia programs, such as that of the University of Ottawa in Canada,7 suggest simplified assessment tools like those used in the oral exams of the Royal College of Physicians and Surgeons of Canada, which serve to evaluate residents' medical knowledge and critical thinking associated with questions intended to guide their learning.

Simulation-based assessment is perhaps one of the most evidence-based tools to acquire competencies in the management of simulated intraoperative events; however, further studies are needed to determine its validity in terms of clinical performance and knowledge transfer.60,62

Some studies focused on measuring the effectiveness and validity of methods such as script concordance tests, the OSCE (ECOE), the Mini-CEX, and DOPS, among others, have shown their usefulness for assessing graduate anesthesiology students, but at a higher cost.4-6,65

In parallel with the difficulties of applying these "novel" assessment methods within anesthesia practice, Tetzlaff demonstrates the virtues of problem-based assessment for the acquisition and assessment of both technical and non-technical competencies at a more reasonable cost.4-6

To summarize, assessment in anesthesiology is characterized in most cases by a gap between what should be evaluated and what is ultimately evaluated, with greater weight given to theoretical knowledge than to procedural skills and clinical judgment when residents are assessed. This prevents a proper appraisal of the competency level reached by the student, as it does not amount to an overall assessment.

The advent of multiple assessment instruments designed under the precept of "assessing to learn" and the assessment utility formula proposed by Van der Vleuten et al36 show that anesthesiology has a significant gap between the application of such instruments in specific teaching situations and competency-based assessment in this specialty10-17,25-32,37-46,56,57,60-67,71-78,82-89 (Table 4).
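For reference, the utility formula mentioned above is usually summarized in the education literature as a multiplicative model; the expression below is the commonly cited form, not a reproduction of Table 4:

U = R × V × E × A × C

where U is the utility of an assessment, R its reliability, V its validity, E its educational impact, A its acceptability, and C its cost-efficiency. Because the terms multiply, an instrument that scores close to zero on any one dimension has low overall utility, however strong the others.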

Table 4 Anesthesia assessment instruments according to the Van der Vleuten Equation. 

* CUSUM curves or cumulative learning curves. CUSUM charts are models that evaluate the success rate in the performance of a task over time, considering the assessment method's chances of failure in terms of type 1 and type 2 errors, and the skill being assessed in terms of acceptable and unacceptable probabilities of failure (a computational sketch follows the table). In anesthesiology, CUSUM charts have been used not only to assess psychomotor learning but also to describe its evolution over time in both trained and untrained individuals. The most frequently assessed procedures are orotracheal intubation (OTI), vascular catheterization, and regional anesthesia.39

Source: Authors.
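To make the CUSUM construction described in the table footnote concrete, the following minimal Python sketch computes a cumulative-sum learning curve from a sequence of procedure outcomes, assuming the standard parameterization in terms of acceptable and unacceptable failure rates (p0, p1) and type 1/type 2 error levels (alpha, beta). The function name, the parameter values, and the trainee record are illustrative assumptions, not data or code from the cited studies.

```python
import math

def cusum_learning_curve(outcomes, p0=0.05, p1=0.10, alpha=0.10, beta=0.10):
    """Cumulative-sum (CUSUM) learning curve for a sequence of procedure outcomes.

    outcomes -- iterable of booleans, True = failed attempt, False = successful attempt
    p0       -- acceptable failure rate for the procedure
    p1       -- unacceptable failure rate (p1 > p0)
    alpha    -- type 1 error: risk of labelling an acceptable trainee as unacceptable
    beta     -- type 2 error: risk of labelling an unacceptable trainee as acceptable
    Returns the CUSUM trajectory and the decision interval h.
    """
    P = math.log(p1 / p0)
    Q = math.log((1 - p0) / (1 - p1))
    s = Q / (P + Q)                              # amount subtracted after each success
    h = math.log((1 - beta) / alpha) / (P + Q)   # spacing of the decision limits

    score, trajectory = 0.0, [0.0]
    for failed in outcomes:
        score += (1 - s) if failed else -s       # a failure raises the curve, a success lowers it
        trajectory.append(score)
    return trajectory, h

# Hypothetical record of 40 orotracheal intubation attempts (2 failures).
attempts = [True, False, False, True] + [False] * 36
curve, h = cusum_learning_curve(attempts)

# A sustained fall of more than h below an earlier peak supports a failure rate
# near p0 (acceptable); a rise of more than h suggests a rate near p1.
print(f"decision interval h = {h:.2f}, final CUSUM = {curve[-1]:.2f}")
```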

The need to assess anesthesiology residents in the clinical setting is evident; however, there is no consensus on the use, let alone the selection, of the best strategy to assess their performance and learning. Although the assessment trend focuses on the "patient care" and "medical knowledge" competencies, there is great interest in other types of tools and instruments for other competencies within the ACGME and CanMEDS theoretical frameworks. For example, to assess non-technical skills, the University of Aberdeen in Scotland designed the Anaesthetists' Non-Technical Skills (ANTS) tool, currently incorporated by the UK's Royal College of Anaesthetists for the routine assessment of anesthesia residents and as a possible national selection tool for future anesthesiologists.80,87

The development of an appropriate programmatic assessment system must start from the fact that no single assessment method or tool is intrinsically superior or sufficient to assess all competencies. Regardless of the proposed curricular model, programs should therefore ensure the design and implementation of assessment methods that are consistent with the curricular philosophy, their priorities, and their learning objectives.

Conclusion

The analysis of assessment in terms of its educational impact and historical development indicates that the way assessment is carried out substantially influences students' learning styles; hence the importance of not making assessment an isolated measure of student performance.

Competency is content or context specific, and therefore more than one method or measurement, appropriate to the learning level, is required to assess it.13 This highlights the importance of an assessment program that, in a structured manner and in line with the curricular philosophy, uses multiple instruments to obtain the greatest amount of data and attributes related to student performance.

It is easier to recognize a competency when it has been developed than when it is absent,33 so it is important to assess all aspects of training, particularly in areas where procedural skill acquisition appears to be of most importance and less attention is paid to the acquisition of the trainees' other professional skills.

Programs and teachers have the responsibility to define the complex competencies and skills to be learned35,36 and how to teach and evaluate them, to recognize when to delegate responsibilities and when the resident can face the real world in unsupervised conditions.

Every day there are more resources to turn assessment into a transformative tool for learning. Today, multiple competency measurement instruments based on the traditional Miller pyramid make it possible to assess both technical and non-technical skills in terms of residents' process and progress, and to apply the often-discussed concept of student individualization more broadly.13,33,90

Although there is still a long way to go in anesthesia, there is great interest in refining other types of tools and instruments and studying their impact in specific scenarios of the specialty. Curricular reforms, a change of vision, and the professionalization of the medical discipline have expanded the room for improvement in teaching, as well as the application of new assessment strategies and instruments that could be positive and increase the likelihood of "significant learning" in anesthesiology residents.

Ethical responsibilities

This article is based on and follows the "Scientific, technical and administrative standards for health research" established in Resolution 8430 of 1993 of the Ministry of Health of the Republic of Colombia. The published study was deemed low-risk research requiring written informed consent, as it used private documents as well as opinions and personal data whose use may cause psychological and/or social changes or modify human behavior.

Acknowledgments

Study assistance: none.

References

1. Van Der Vleuten CP. Revisiting assessing professional competence: from methods to programmes. Med Educ 2016;50:885-888. [ Links ]

2. Celman S. ¿Es posible mejorar la evaluación y transformarla en herramienta de conocimiento? In: Camilloni AW. La evaluación de los aprendizajes en el debate didáctico contemporáneo. Buenos Aires: Paidós; 1998. [ Links ]

3. Colbert CY, Dannefer EF, French JC. Clinical competency committees and assessment: changing the conversation in graduate medical education. J Grad Med Educ 2015;7:162-165. [ Links ]

4. Tetzlaff JE. Assessment of competency in anesthesiology. Anesthesiology 2007;106:812-825. [ Links ]

5. Tetzlaff JE. Assessment of competence in anesthesiology. Curr Opin Anaesthesiol 2009;22:809-813. [ Links ]

6. Tetzlaff JE. Evaluation of anesthesia residents. In: Frost EAM. Comprehensive Guide to Education in Anesthesia. New York: Springer; 2014. p. 129-145. [ Links ]

7. Fraser AB, Stodel EJ, Jee R, et al. Preparing anesthesiology faculty for competency-based medical education. Surv Anesthesiol 2017;61:32-33. [ Links ]

8. Bould MD, Naik VN, Hamstra SJ. Review article: new directions in medical education related to anesthesiology and perioperative medicine. Can J Anesth 2012;59:136-150. [ Links ]

9. Boet S, Pigford AAE, Naik VN. Program director and resident perspectives of a competency-based medical education anesthesia residency program in Canada: a needs assessment. Korean J Med Educ 2016;28:157-168. [ Links ]

10. Jonker G, Manders L, Marty A, et al. Variations in assessment and certification in postgraduate anaesthesia training: a European survey. Br J Anaesth 2017;119:1009-1014. [ Links ]

11. Ross FJ, Metro DG, Beaman ST, et al. A first look at the Accreditation Council for Graduate Medical Education anesthesiology milestones: implementation of self-evaluation in a large residency program. J Clin Anesth 2016;32:17-24. [ Links ]

12. Yamamoto S, Tanaka P, Madsen MV, et al. Comparing anesthesiology residency training structure and requirements in seven different countries on three continents. Cureus 2017;9:e1060. [ Links ]

13. Durante E. Algunos métodos de evaluación de las competencias: Escalando la pirámide de Miller. Revista del Hospital Italiano 2006;55-61. [ Links ]

14. Ebert TJ, Fox CA. Competency-based education in anesthesiology: history and challenges. Anesthesiology 2014;120:24-31. [ Links ]

15. Frost E. Comprehensive Guide to Education in Anesthesia. New York: Springer science + business Media; 2014. [ Links ]

16. Baker K. Determining resident clinical performance: getting beyond the noise. Anesthesiology 2011;115:862-878. [ Links ]

17. Boulet JR, Murray D. Review article: assessment in anesthesiology education. J Can Anesth 2012;59:182-192. [ Links ]

18. The Accreditation Council for Graduate Medical Education and The American Board of Anesthesiology. The Anesthesiology Milestone Project. [Internet]. Accreditation Council for Graduate Medical Education. [Cited 2020 May 10]. Available at: https://www.acgme.org/. [ Links ]

19. The Royal College of Physicians and Surgeons of Canada [Internet]. Royal College of Physicians and Surgeons of Canada. [Cited 2020 May 12]. Available at: http://www.royalcollege.ca/rcsite/canmeds-e. [ Links ]

20. The Standing Committee on Education and Professional Development of the Section and Board of Anaesthesiology. European Training Requirement ETR in Anesthesiology [Internet]. Available at: https://www.uems.eu/about-us/medical-specialties. [ Links ]

21. Larsson J, Holmström I. Understanding anesthesia training and trainees. Curr Opin Anaesthesiol 2012;25:681-685. [ Links ]

22. Gessel EV, Mellin-Olsen J, Østergaard HT, et al. Postgraduate training in anaesthesiology, pain and intensive care. Eur J Anaesthesiol 2012;29:165-168. [ Links ]

23. Chiu M, Crooks S, Tarshis J, et al. Simulation-based assessment of anesthesiology residents' competence: development and implementation of the Canadian National Anesthesiology Simulation Curriculum (CanNASC). Can J Anesth 2016;63:1357-1363. [ Links ]

24. Carlsson C, Keld D, Van Gessel E, et al. Education and training in anaesthesia - revised guidelines by the European Board of Anaesthesiology, Reanimation and Intensive Care: Section and Board of Anaesthesiology, European Union of Medical Specialists. Eur J Anaesthesiol 2008;25:528-530. [ Links ]

25. Rebel A, Dilorenzo A, Nguyen D, et al. Should objective structured clinical examinations assist the clinical competency committee in assigning anesthesiology milestones competency? Anesth Analg 2019;129:226-234. [ Links ]

26. Cate OT. Entrustability of professional activities and competency-based training. Med Educ 2005;39:1176-1177. [ Links ]

27. Cate OT. Nuts and bolts of entrustable professional activities. J Grad Med Educ 2013;5:157-158. [ Links ]

28. Wisman-Zwarter N, Schaaf MVD, Cate OT, et al. Transforming the learning outcomes of anaesthesiology training into entrustable professional activities. Eur J Anaesthesiol 2016;33:559-567. [ Links ]

29. Jonker G, Hoff RG, Cate OTJT. A case for competency-based anaesthesiology training with entrustable professional activities. Eur J Anaesthesiol 2015;32:71-76. [ Links ]

30. The Anesthesiology Milestone Project. J Grad Med Educ 2014;6(1 Suppl 1):15-28. [ Links ]

31. Sivaprakasam J, Purva M. CUSUM analysis to assess competence: what failure rate is acceptable? Clin Teach 2010;7:257-261. [ Links ]

32. Neira VM, Bould MD, Nakajima A, et al. GIOSAT: a tool to assess CanMEDS competencies during simulated crises. Can J Anesth 2013;60:280-289. [ Links ]

33. Jaramillo S, Vargas R. Cañadas R, Vargas R, Rincon R, et al. Cómo aprenden los adultos: una Aproximación desde la enseñanza médica. Currículo nuclear en endoscopia digestiva: fundamentos teóricos y propuesta curricular Bogotá: Panamericana; 2018; 15-25. [ Links ]

34. Sociedad Colombiana de Anestesiología y Reanimación. Documento marco del Plan de Estudios y Competencias para un Programa de Anestesiología en Colombia. Bogotá: SCARE; 2017. [ Links ]

35. Van der Vleuten C, Schuwirth L, Driessen E, et al. A model for programmatic assessment fit for purpose. Med Teach 2012;34:205-214. [ Links ]

36. Van Der Vleuten CPM, Schuwirth LWT, Scheele F, et al. The assessment of professional competence: building blocks for theory development. Best Pract Res Clin Obstet Gynaecol 2010; 24:703-719. [ Links ]

37. Bilotta F, Titi L, Lanni F, et al. Training anesthesiology residents in providing anesthesia for awake craniotomy: learning curves and estimate of needed case load. J Clin Anesth 2013;25:359-366. [ Links ]

38. Ramírez LJ, Moreno MA, Gartdner L, et al. Modelo de enseñanza de las habilidades psicomotoras básicas en anestesia para estudiantes de ciencias de la salud: sistematización de una experiencia. Colombian Journal of Anesthesiology 2008;36:85-92. [ Links ]

39. Aguirre Ospina OD, Ríos Medina ÁM, Calderón Marulanda M, et al. Cumulative Sum learning curves (CUSUM) in basic anaesthesia procedures. Colombian Journal of Anesthesiology 2014;42:142-153. [ Links ]

40. Stiegler MP, Tung A. Cognitive processes in anesthesiology decision making. Anesthesiology 2014;120:204-217. [ Links ]

41. Enser M, Moriceau J, Abily J, et al. Background noise lowers the performance of anaesthesiology residents’ clinical reasoning when measured by script concordance. Eur J Anaesthesiol 2017;34:464-470. [ Links ]

42. Echevarría Moreno M, Prieto Vera C, Martin Telleria A, et al. The objective structured clinical evaluation of teaching in anaesthesiology and resuscitation. Rev Esp Anestesiol Reanim 2012;59:134-141. [ Links ]

43. Ben-Menachem E, Ezri T, Ziv A, et al. Objective structured clinical examination-based assessment of regional anesthesia skills: the israeli national board examination in anesthesiology experience. Anesth Analg 2011;112:242-245. [ Links ]

44. Ahmed O, O’Donnell B, Gallagher A, et al. Development of performance and error metrics for ultrasound-guided axillary brachial plexus block. Adv Med Educ Pract 2017;5:257-263. [ Links ]

45. Cheung JJH, Chen EW, Darani R, et al. The creation of an objective assessment tool for ultrasound-guided regional anesthesia using the Delphi method. Reg Anesth Pain Med 2012;37:329-333. [ Links ]

46. Chin KJ, Tse C, Chan V, et al. Hand motion analysis using the Imperial College surgical assessment device: validation of a novel and objective performance measure in ultrasound-guided peripheral nerve blockade. Reg Anesth Pain Med 2011;36:213-219. [ Links ]

47. Chuan A, Thillainathan S, Graham PL, et al. Reliability of the direct observation of procedural skills assessment tool for ultrasound-guided regional anaesthesia. Anaesth Intensive Care 2016;44:201-209. [ Links ]

48. Watson MJ, Wong DM, Kluger R, et al. Psychometric evaluation of a direct observation of procedural skills assessment tool for ultrasound-guided regional anaesthesia. Anaesthesia 2014;69: 604-612. [ Links ]

49. Laurent DA, Niazi A, Cunningham M, et al. A valid and reliable assessment tool for remote simulation-based ultrasound-guided regional anesthesia. Reg Anesth Pain Med 2014;39:496-501. [ Links ]

50. Corvetto MA, Fuentes C, Araneda A, et al. Validation of the imperial college surgical assessment device for spinal anesthesia. BMC Anesthesiol 2017;17:131. [ Links ]

51. Chuan A, Wan AS, Royse C, et al. Competency-based assessment tools for regional anaesthesia: a narrative review. Br J Anaesth 2017;120:264-273. [ Links ]

52. Chuan A, Graham PL, Wong DM, et al. Design and validation of the Regional Anaesthesia Procedural Skills Assessment Tool. Anaesthesia 2015;70:1401-1411. [ Links ]

53. Hastie MJ, Spellman JL, Pagnano PP, et al. Designing and implementing the objective structured clinical examination in anesthesiology. Anesthesiology 2014;120:196-203. [ Links ]

54. Farnan JM, Petty LA, Georgitis E, et al. A systematic review: the effect of clinical supervision on patient and residency education outcomes. Acad Med 2012;87:428-442. [ Links ]

55. Moore DL, Ding L, Sadhasivam S. Novel real-time feedback and integrated simulation model for teaching and evaluating ultrasound-guided regional anesthesia skills in pediatric anesthesia trainees. Paediatr Anaesth 2012;22:847-853. [ Links ]

56. Riveros R, Kimatian S, Castro P, et al. Multisource feedback in professionalism for anesthesia residents. J Clin Anesth 2016; 34:32-40. [ Links ]

57. Smith SE, Tallentire VR. The right tool for the right job: the importance of CUSUM in self-assessment. Anaesthesia 2011; 66:747. [ Links ]

58. Gaba DM, Howard SK, Flanagan B, et al. Assessment of clinical performance during simulated crises using both technical and behavioral ratings. Anesthesiology 1998;89:8-18. [ Links ]

59. Gaba DM, Howard SK, Gan BF, et al. Assessment of clinical performance during simulated crises using both technical and behavioral ratings. Surv Anesthesiol 1999;43:111-112. [ Links ]

60. Murray DJ, Boulet JR, Avidan M, et al. Performance of residents and anesthesiologists in a simulation-based skill assessment. Anesthesiology 2007;107:705-713. [ Links ]

61. Murray DJ, Boulet JR, Kras JF, et al. A simulation-based acute skills performance assessment for anesthesia training. Anesth Analg 2005;101:1127-1134. [ Links ]

62. Rábago JL, López-Doueil M, Sancho R, et al. Learning outcomes evaluation of a simulation-based introductory course to anaesthesia. Rev Esp Anestesiol Reanim 2017;64:431-440. [ Links ]

63. Fehr JJ, Boulet JR, Waldrop WB, et al. Simulation-based assessment of pediatric anesthesia skills. Anesthesiology 2011;115:1308-1315. [ Links ]

64. Lammers RL, Davenport M, Korley F, et al. Teaching and assessing procedural skills using simulation: metrics and methodology. Acad Emerg Med 2008;15:1079-1087. [ Links ]

65. Rebel A, DiLorenzo A, Fragneto RY, et al. Objective assessment of anesthesiology resident skills using an innovative competition-based simulation approach. A A Case Rep 2015;5:79-87. [ Links ]

66. Schwid HA, Rooke GA, Carline J, et al. Evaluation of anesthesia residents using mannequin-based simulation: a multiinstitutional study. Anesthesiology 2002;97:1434-1444. [ Links ]

67. Byrne AJ, Greaves JD. Assessment instruments used during anaesthetic simulation: review of published studies. Br J Anaesth 2001;86:445-450. [ Links ]

68. Corvetto MA, Bravo MP, Montaña RA, et al. Bringing clinical simulation into an anesthesia residency training program in a university hospital. Participants’ acceptability assessment. Rev Esp Anestesiol Reanim 2013;60:320-326. [ Links ]

69. Hindman BJ, Dexter F, Smith TC. Anesthesia residents’ global (departmental) evaluation of faculty anesthesiologists’ supervision can be less than their average evaluations of individual anesthesiologists. Anesth Analg 2015;120:204-208. [ Links ]

70. Mitchell JD, Holak EJ, Tran HN, et al. Are we closing the gap in faculty development needs for feedback training? J Clin Anesth 2013;25:560-564. [ Links ]

71. O'Sullivan O, Shorten GD. Formative assessment of ultrasound-guided regional anesthesia. Reg Anesth Pain Med 2011;36:522-523. [ Links ]

72. Hindman BJ, Dexter F, Kreiter CD, et al. Determinants, associations, and psychometric properties of resident assessments of anesthesiologist operating room supervision. Anesth Analg 2013;116:1342-1351. [ Links ]

73. Bindal N, Goodyear H, Bindal T, et al. DOPS assessment: a study to evaluate the experience and opinions of trainees and assessors. Med Teach 2013;35:e1230-e1234. [ Links ]

74. De Oliveira Filho GR, Dal Mago AJ, Garcia JHS, et al. An instrument designed for faculty supervision evaluation by anesthesia residents and its psychometric properties. Anesth Analg 2008;107:1316-1622. [ Links ]

75. Ahmed K, Miskovic D, Darzi A, et al. Observational tools for assessment of procedural skills: a systematic review. Am J Surgery 2011;202:469-80e6. [ Links ]

76. Rebel A, DiLorenzo AN, Fragneto RY, et al. A competitive objective structured clinical examination event to generate an objective assessment of anesthesiology resident skills development. A Case Rep 2016;6:313-319. [ Links ]

77. Norris A, McCahon R. Cumulative sum (CUSUM) assessment and medical education: a square peg in a round hole. Anaesthesia 2011;66:250-254. [ Links ]

78. Khaliq T. Reliability of results produced through objectively structured assessment of technical skills (OSATS) for endotracheal intubation (ETI). J Coll Physicians Surg Pak 2013;23:51-55. [ Links ]

79. Flin R, Patey R, Glavin R, et al. Anaesthetists’ non-technical skills. Br J Anaesth 2010;105:38-44. [ Links ]

80. Flin R, Patey R. Non-technical skills for anaesthetists: developing and applying ANTS. Best Pract Res Clin Anaesthesiol 2011;25:215-227. [ Links ]

81. Graham J, Hocking G, Giles E. Anaesthesia non-technical skills: can anaesthetists be trained to reliably use this behavioural marker system in 1 day? Br J Anaesth 2010;104:440-445. [ Links ]

82. Ahmed A. Assessment of procedural skills in anesthesiology trainees: changing trends. Anaesth, Pain & Intensive Care 2014;18:135-36. [ Links ]

83. Bould MD, Crabtree NA, Naik VN. Assessment of procedural skills in anesthesia. Br J Anaesth 2009;103:472-483. [ Links ]

84. Witt A, Iglesias S, Ashbury T. Evaluation of Canadian family practice anesthesia training programs: can the resident logbook help? Can J Anesth 2012;59:968-973. [ Links ]

85. Weller JM, Castanelli DJ, Chen Y, et al. Making robust assessments of specialist trainees’ workplace performance. Br J Anaesth 2017;118:207-214. [ Links ]

86. Weller JM, Jones A, Merry AF, et al. Investigation of trainee and specialist reactions to the mini-clinical evaluation exercise in anaesthesia: implications for implementation. Br J Anaesth 2009;103:524-530. [ Links ]

87. Kathirgamanathan A, Woods L. Educational tools in the assessment of trainees in anaesthesia. Contin Educ Anaesth Crit Care Pain 2011;11:138-142. [ Links ]

88. Castanelli DJ, Castanelli DJ, Jowsey T, et al. Perceptions of purpose, value, and process of the mini-clinical evaluation exercise in anesthesia training. Can J Anesth 2016;63:1345-1356. [ Links ]

89. Colbert-Getz J, Ryan M, Hennessey E, et al. Measuring assessment quality with an assessment utility rubric for medical education. MedEdPORTAL 2017;13:10588. [ Links ]

90. Van Meeuwen LW, Brand-Gruwel S, Kirschner PA, et al. Fostering self-regulation in training complex cognitive tasks. Educ Tech Res Dev 2018;66:53. [ Links ]

How to cite this article: Jaramillo-Rincon SX, Durante E, Ladenheim R, Díaz-Cortés JC. Anesthesia assessment in the era of competences: state of the art. Case report. Colombian Journal of Anesthesiology. 2020;48:145-154.

Copyright © 2020 Sociedad Colombiana de Anestesiología y Reanimación (S.C.A.R.E.). Published by Wolters Kluwer. This is an open access article under the CC BY-NC-ND license (https://creativecommons.org/licenses/by-nc-nd/4.0/).

Funding The authors have no funding to disclose.

Conflicts of interest The authors have no conflicts of interest to disclose.

* Correspondence: Clínica de Marly, Calle 50 # 9-67, Surgery Rooms, Bogotá, Colombia. E-mail: sx.jaramillo@uniandes.edu.co
