
Impact of simulation debriefing structure on knowledge and skill acquisition for postgraduate critical care nursing students: three-phase vs. multiphase

Abstract

Background

Simulation is part of the training provided to nurses enrolled in the master’s degree for critical care nursing programmes at our institution. Although the students are practicing nurses, many still make mistakes when performing nursing procedures related to critical care during simulation sessions, and these mistakes must be addressed during the debriefing session. The aim of the study is to compare the knowledge and skills acquired by groups of postgraduate critical care nursing students who were exposed to high-fidelity simulation (HFS) by using different debriefing structures.

Methods

A quasi-experimental crossover design was used, with knowledge and skills assessed through post-tests and objective structured clinical examinations (OSCEs). The students were divided into two groups: one was exposed to HFS with a 3-phase debriefing, and the other was exposed to HFS with a multiphase debriefing. Both debriefings were facilitator-guided and video-assisted.

Results

Overall, the post-test scores (Phase 1: p = 0.001; Phase 2: p < 0.001) and post-OSCE scores (Phase 1: p = 0.002; Phase 2: p = 0.002) indicate that the postgraduate students who underwent HFS with a multiphase debriefing structure achieved significantly higher scores than those who underwent HFS with a 3-phase debriefing structure.

Conclusion

Debriefing is a critical component of successful simulation. Learning requires assessment that creates constructive criticism based on feedback and reflection. A multiphase debriefing structure, specifically the healthcare simulation after-action review, provides a significant advantage for knowledge and skills acquisition.


Background

Simulation is part of the clinical practicum for nurses enrolled in the master’s degree in critical care nursing programme at Fakeeh College for Medical Sciences in Jeddah, Saudi Arabia. It enhances their current knowledge and skills in caring for critically ill patients. Knowledge and skills acquisition is crucial for postgraduate critical care nursing students because it underpins competence in the field of critical care. Moreover, it nurtures personal and professional accountability; appropriate knowledge and skills therefore promote quality care and patient safety. In the process of learning, recognising shortcomings and identifying improvements are pivotal to knowledge acquisition and development. In healthcare, one of the most important strategies for ensuring that nursing students and practitioners consolidate knowledge from simulation-based training is structured debriefing: a specific and active process for reviewing individual and team performance during simulation training. Through these discussions, learners reflect on their actions in ways that encourage better performance in actual clinical situations. This study compared three-phase and multiphase debriefing structures because the simulation facilitators in the Clinical Skills and Simulation Center (CSSC) are not unified on which structure to use, although they were trained on both. Identifying the debriefing structure that yields the best outcomes in nurses’ knowledge and skills acquisition will help achieve the goal of simulation.

Simulation is a technique that imitates real-life situations. It is utilised in several contexts, such as an educational process for students to implement learned skills and re-enact performances before the actual delivery to the healthcare system [1, 2]. After the simulation, a follow-up procedure is needed for the reflective process of the students through debriefing procedures. Students can reflect, contemplate, and associate their physical performances during the simulation to fully grasp the cognitive interpretation of the scenario during the debriefing process [2, 3]. Generally, a debriefing conducted right after a simulation scenario is more effective than one during the simulation in terms of knowledge and confidence [4, 5]. Moreover, debriefing should be based on a structured framework; however, researchers disagree over which framework is most effective [6,7,8,9].

In the process of simulated learning experiences, debriefing emerged as the foundation of the learning experience’s success [2]. Hence, debriefing can be a significant factor in achieving efficiency in experiential learning [10,11,12]. Following a simulated or real-life scenario, debriefing allows students to reflect, explore, and provide more appropriate actions and clinical reasoning patterns to foster improved quality in the healthcare system [12].

Receiving structured debriefing is identified as a fundamental part of attaining effective learning [13,14,15]. Incorporating the debriefing process into simulation-based training provides more opportunities for enhanced learning and improves the participants’ awareness of their actions and effectiveness. As a structured process, debriefing develops students’ learning capacity; it stimulates the transfer of knowledge and improves technical and behavioural skills. Additionally, debriefing offers best practices to be integrated into the participants’ professional role to achieve safe and quality healthcare [16,17,18].

High-fidelity simulation (HFS) integrated with debriefing is accepted as an essential component of nursing education. The training provided by these two learning processes is an essential tool for advancing nurses’ performance in their traditional and professional roles [19,20,21,22]. During training, practising self-reflection skills is beneficial for aspiring simulation facilitators: it improves their capacity to explore emotions, allowing them to establish communication techniques and improve their problem-solving skills to form judgements [23]. However, owing to a lack of evidence in the nursing literature, limited support is available to prove the efficacy of HFS and debriefing practices in Advanced Practice Registered Nurse or Acute Care Nurse Practitioner education [19, 21, 24]. Furthermore, despite clear evidence supporting the use of simulation in nursing education, there is little confirmation regarding the most effective debriefing strategy or model for simulations designed for nurses in postgraduate critical care nursing programmes.

In HFS, an essential aspect of learning is the process of debriefing; thus, the core of the literature on HFS focuses predominantly on the process of debriefing and the reflection on action learning [25,26,27,28]. To attain the standards of debriefing, the International Nursing Association for Clinical Simulation and Learning highlights the currently available framework, such as gather, analyse, summarise (GAS); debriefing with good judgement; promoting excellence and reflective learning in simulation; debriefing for meaningful learning; plus-delta; 3D model of debriefing; and the outcome-present state test model of clinical reasoning. The association also set a fundamental criterion to follow: (1) qualified person(s) who facilitate a debrief must be capable and have adequate knowledge for conducting the debrief; (2) the debrief should be operated in a setting that is favourable to learning and promotes confidentiality, reliability, free-flow of communication, self-awareness, personal analysis, assessment, healthy criticism, and intrapersonal skills; (3) facilitator(s) must show commitment in providing sufficient concentrated attention during the debrief to achieve its efficacy after the simulation-based experience; (4) the foundation of the debrief should be based on a theoretical framework that represents its purpose; and (5) there should be harmony between the debrief, objectives, and outcomes of the simulation training [17].

Approximately 10 million Saudi Riyals were invested by Fakeeh College for Medical Sciences to construct the CSSC in March 2019. The facility has the mandate to promote the development of clinical performance and enhanced competence among undergraduate and postgraduate nursing students in compliance with the national and international standards of nursing practice. This study aimed to compare the knowledge and skills acquired by groups of postgraduate critical care nursing students exposed to HFS utilising different debriefing structures (three-phase and multiphase).

Methods

This study utilised a quasi-experimental crossover design to compare the knowledge and skills acquired by groups of postgraduate critical care nursing students exposed to HFS utilising different debriefing structures, namely three-phase and multiphase. The crossover design made it possible to establish which debriefing structure was more effective for knowledge and skills acquisition while the students experienced both structures. The design also reduced bias that might arise from differences in individual competency within the allocated groups [29].

The postgraduate students were divided into two groups. One was exposed to HFS with the three-phase debriefing, using the GAS model, which comprises the phases gather, analyse, and summarise [30]. In this structure, the first phase (gather) created a shared mental model by restating the simulation event. The second phase (analyse) was dedicated to analysing the actions taken during the simulation through learner-centred reflection. The final phase (summarise) reviewed the lessons learned and ensured that all teaching points and key learning objectives had been covered [31,32,33].

The other group was exposed to HFS with multiphase debriefing. The healthcare simulation after-action review (AAR) was applied to this group. Using this structure, the debriefing was conducted in seven phases represented by the acronym DEBRIEF. These phases were as follows: defining rules, explaining learning objectives, benchmarking performance, reviewing expected actions, identifying what occurred, examining why things occurred the way they did, and formalising learning [34]. Additionally, both debriefing structures were facilitator-guided and video-assisted.

Before the study, nursing faculty members, including the researchers who conducted the current study, had received intensive training on the technical preparation and operation of the institution’s high-fidelity simulators. The training also covered simulation clinical scenario (SCE) development, the use of CAE LearningSpace to run simulation scenarios, the simulation process, and workshops on debriefing with both structures: the GAS model and the Healthcare Simulation AAR framework. During the study, however, only one facilitator was assigned to conduct all aspects of the simulation (pre-briefing, simulation scenario, and debriefing) for each scenario. All SCEs, including objectives and stations, were formulated by the facilitators with the help of the clinical instructors and nurse preceptors in the intensive care units of the college-based hospital and were further reviewed by faculty members specialising in critical care nursing. Moreover, the debriefings were standardised.

Participants

Postgraduate students currently enrolled in the critical care nursing practicum course were recruited for the study. The participants were both male and female, with an age range of 27–45 years, and had bachelor of science in nursing degrees with licenses to practice nursing in Saudi Arabia. All participants were working in either government or private tertiary hospitals in Saudi Arabia as staff nurses, nurse educators, and nurse managers in the critical care unit. They had more than 3 years of experience in critical care and comprised Saudi and non-Saudi nationals.

Thirty-seven postgraduate students were enrolled in the critical care nursing practicum course, and all were given the opportunity to participate in the study. Of these, 4 declined to participate, and 3 others did not complete all phases of the HFS sessions owing to availability issues and other personal circumstances, leaving 30 participants; in effect, all eligible consenting students were included (complete enumeration). The 30 participants were randomly assigned, 15 to Group A and 15 to Group B, by writing their student ID numbers on paper and drawing lots.

The details and purpose of the study were explained to the participants, and informed consent was obtained in writing before the study commenced. These students were exposed to HFS sessions but utilised different debriefing structures: the three-phase structure following the GAS model and the multiphase structure using the Healthcare Simulation AAR framework.

Data collection procedure

Simulation sessions were divided into three phases: pre-briefing, simulation scenario, and debriefing. Phase 1 of the HFS exposure comprised three scenarios, each run on a different day (with CAE iStan and CAE Apollo): (Sim1) managing angina and myocardial infarction, (Sim2) burns and spinal shock management, and (Sim3) trauma for fractures and amputations. Group A underwent the three phases of simulation (pre-briefing, simulation scenarios, and debriefing) using the GAS model. Group B underwent simulation sessions (including pre-briefing, simulation scenarios, and debriefing) utilising the ‘Healthcare Simulation AAR’ framework.

Phase 2 of HFS exposure comprised another three scenarios, each run on a different day (with CAE iStan and CAE Apollo): (Sim4) managing pneumonia with septic shock, (Sim5) managing pulseless ventricular tachycardia with defibrillation, and (Sim6) trauma for managing chest stab wounds. This time, Group A underwent simulation sessions (including pre-briefing, simulation scenarios, and debriefing) utilising the Healthcare Simulation AAR framework, and Group B underwent the three phases of simulation sessions (pre-briefing, simulation scenarios, and debriefing) using the GAS model. All scenarios with the identified objectives were applied during Phases 1 and 2 of the HFS sessions (see Table 1). The six scenarios were created and selected based on the competencies listed in the course specifications. Crossing the debriefing structures between Phases 1 and 2 ensured that both groups experienced both the three-phase and multiphase structures, allowing the impact of each structure to be assessed in both groups for a given scenario.

Table 1 HFS sessions and durations of pre-briefing, scenario, and debriefing

Before the debriefing, the facilitator reviewed the participants’ performance through the recorded video and organised their thoughts and observations. The facilitator then went through the steps of the debriefing structure (GAS model or Healthcare Simulation AAR framework) and rehearsed what they would discuss. All materials (e.g. flip charts, television, computer, and videos) were prepared and tested.

During the debriefing phase, under the GAS debriefing model, facilitators asked each participant to share their experience through a narrative, including their emotional state and reactions during the simulation sessions. Following the group sharing, facilitators reviewed and recorded the details of the sessions, provided feedback on the participants’ performances, and reported their observations, highlighting the correct and incorrect steps executed by participants. To challenge the participants’ critical thinking, facilitators asked questions while encouraging reflection and providing redirection. In the final part of the debriefing, participants were tasked with listing the actions they felt were efficient and adequate for the success of the simulation sessions. From this list, participants were able to identify, discuss, and rethink the actions required to improve their performance in future simulations or actual healthcare settings.

In the Healthcare Simulation AAR framework, the facilitator first emphasised that the debriefing was a discussion of participants’ experiences during the simulation, not a lecture or critique, and that its purpose was to learn from each other’s experiences; all participants were encouraged to share. Second, the facilitator reviewed the learning objectives of the scenario with the participants. Third, they benchmarked participants’ performance and actions against the standards in the references used for the scenario. Fourth, they briefly explained the simulation scenario and discussed the actions expected of participants. Fifth, they discussed observations of what actually happened during the scenario. Sixth, they asked a volunteer among the participants to recap what happened during the simulation and identify the correct and incorrect actions taken, to fill the performance gaps. Lastly, each participant was asked to answer the following questions briefly: ‘What went well in the scenario?’; ‘What did not go well?’; and ‘What would you do next time if you faced a similar situation in real life?’ These questions were based on the theory of experiential learning [35].

The knowledge acquired by participants was assessed with a post-test after each scenario. The tests were situational and required critical thinking. Each test comprised 25 multiple-choice questions (MCQs) with five options, worth 1 point each, and 5 essay (open-ended) questions worth 5 points each, for a total of 50 points per simulation session. The test questions were prepared by the faculty members and clinical preceptors involved in the simulation training, using critical care nursing books and the scenarios as references. The test, together with a copy of the simulation scenarios, was then sent to two critical care nursing experts who checked its appropriateness and validity. A pilot test was conducted on 10 clinical preceptors from the base hospital to ensure that the exam could be completed in under 45 min, and item analysis was performed (15 moderate, 3 difficult, and 3 challenging MCQs; 2 moderate, 1 difficult, and 1 challenging essay questions) for each simulation scenario. Post-test reliability for each scenario was assessed using Cronbach’s alpha (Sim1: 0.86, Sim2: 0.89, Sim3: 0.88, Sim4: 0.82, Sim5: 0.87, and Sim6: 0.91); according to Taber, an alpha value ranging from 0.73 to 0.95 indicates high reliability [36]. The test was encoded in and administered through computer-based exam software (the Speedwell system).
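
As an illustration of the reliability analysis above, Cronbach’s alpha can be computed from an item-score matrix. The sketch below is a minimal, stdlib-only implementation; the `cronbach_alpha` helper and any data passed to it are hypothetical, since the study reports only the resulting alpha values, not raw item scores.

```python
# Minimal sketch: Cronbach's alpha from per-item scores.
# All data passed in is hypothetical; the study reports only the
# final alphas (e.g. Sim1: 0.86).

def cronbach_alpha(items):
    """items: one list of scores per test item, aligned across the
    same respondents (all inner lists have equal length)."""
    k = len(items)                       # number of items
    n = len(items[0])                    # number of respondents

    def var(xs):                         # population variance
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / len(xs)

    item_var_sum = sum(var(item) for item in items)
    totals = [sum(item[i] for item in items) for i in range(n)]
    return (k / (k - 1)) * (1 - item_var_sum / var(totals))
```

Alpha rises as item scores co-vary across respondents: identical items give 1.0, while unrelated items push it toward 0, which is why values of 0.82–0.91 across the six post-tests indicate internally consistent tests.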

Furthermore, the skills acquired by participants were assessed with an objective structured clinical examination (OSCE) for every scenario. The course coordinators and clinical instructors prepared and reviewed all objectives and scenarios, along with a list of equipment needed at each station and procedural rubrics based on the references and scenarios used during the HFS sessions. The OSCE rubrics were validated by the same critical care nursing experts who validated the post-tests. The OSCE stations were set up according to the blueprint. All rubrics used a four-point Likert-type scale: 3 (done correctly), 2 (done incompletely), 1 (done incorrectly), and 0 (not done); the indicators depended on the procedure. Raters and students were instructed regarding their roles in the OSCE task. Station numbers were assigned to laboratories, and all required equipment was arranged accordingly to avoid confusion. The standardised patients were briefed and oriented on their roles during the examination. Raters comprised the clinical instructors of Fakeeh College for Medical Sciences, while the invited preceptors were staff nurses of Dr. Soliman Fakeeh Hospital. Every OSCE station had two raters, who were blinded to the student groups. Post-OSCE reliability was assessed using Kappa coefficients (Sim1: 0.91, Sim2: 0.87, Sim3: 0.91, Sim4: 0.86, Sim5: 0.91, and Sim6: 0.89); according to Cohen, a Kappa result ranging from 0.81 to 1.00 indicates almost perfect agreement [37, 38]. The OSCE checklist was also encoded in the institution’s Speedwell system.
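
Inter-rater agreement of the kind reported above can be illustrated with unweighted Cohen’s kappa, which corrects the raters’ observed agreement for the agreement expected by chance. The ratings in this sketch are hypothetical values on the 0–3 rubric; the study reports only the resulting kappas.

```python
# Minimal sketch: unweighted Cohen's kappa for two raters scoring the
# same students. Ratings are hypothetical values on the 0-3 rubric.
from collections import Counter

def cohen_kappa(rater_a, rater_b):
    n = len(rater_a)
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    ca, cb = Counter(rater_a), Counter(rater_b)
    # Chance agreement if each rater assigned categories independently
    # in proportion to their own marginal frequencies.
    expected = sum(ca[c] * cb[c] for c in set(ca) | set(cb)) / (n * n)
    return (observed - expected) / (1 - expected)
```

Perfect agreement yields kappa = 1.0, so the reported values of 0.86–0.91 across the six OSCEs fall in Cohen’s ‘almost perfect’ band (0.81–1.00).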

Data analysis

The post-test and OSCE scores were automatically calculated and generated in the Speedwell system. The test scores of all students were downloaded and summarised for data analysis. The collected data were encoded and analysed using IBM SPSS 2020 software, and means and standard deviations were used to summarise the data. The t-test was used to compare the post-test and OSCE scores between the groups of postgraduates exposed to HFS with different debriefing structures. A normality test was performed to justify the use of parametric analysis for comparing the two groups. The post-test scores had a skewness of 1.85 with a standard error (SE) of 0.68, and the post-OSCE scores had a skewness of -0.41 with an SE of 0.12.
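
The comparisons described above can be sketched as follows. The study’s statistics were produced in SPSS; this stdlib-only sketch computes a two-sample t statistic (Welch’s form, which does not assume equal variances, shown here as one reasonable choice) and the adjusted Fisher–Pearson sample skewness with its standard error. All score lists passed to these helpers are hypothetical, and a p-value is omitted because it would require the t distribution.

```python
# Minimal sketch: Welch's t statistic and sample skewness with its
# standard error. Input scores are hypothetical; the study's actual
# values were computed in SPSS.
import math
from statistics import mean, stdev

def welch_t(x, y):
    """Two-sample t statistic without assuming equal variances."""
    vx, vy = stdev(x) ** 2, stdev(y) ** 2
    return (mean(x) - mean(y)) / math.sqrt(vx / len(x) + vy / len(y))

def skewness(x):
    """Adjusted Fisher-Pearson skewness (G1) and its standard error."""
    n, m, s = len(x), mean(x), stdev(x)
    g1 = sum(((v - m) / s) ** 3 for v in x) * n / ((n - 1) * (n - 2))
    se = math.sqrt(6 * n * (n - 1) / ((n - 2) * (n + 1) * (n + 3)))
    return g1, se
```

Dividing a skewness value by its standard error gives a rough z-score for how far a distribution departs from symmetry.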

Ethical considerations

The purpose of the study was explained to the postgraduate critical care nursing students, who agreed to participate before the study commenced. Ethical approval for this study was obtained from the Fakeeh College for Medical Sciences Institutional Review Board (Approval No. 288/IRB), and all methods were performed in accordance with the Declaration of Helsinki. Written informed consent was obtained from all participants.

Results

The participants comprised 9 men and 21 women, with an age range of 27–45 years. Each had a Bachelor of Science in Nursing degree with a license to practice nursing in Saudi Arabia. Sixteen participants were working in government hospitals and the rest in private hospitals in Saudi Arabia, as staff nurses, nurse educators, and nurse managers in the critical care unit. All had more than 3 years of experience in critical care, and the group comprised 17 Saudi and 13 non-Saudi nationals.

Table 2 shows the comparison of participants’ acquired knowledge through the post-test scores from the two phases of simulation sessions (Phase 1: Sim1, Sim2, and Sim3; Phase 2: Sim4, Sim5, and Sim6) from the two groups of postgraduate critical care nursing students: Group A who experienced HFS utilising the three-phase debriefing structure (GAS model) and Group B who experienced HFS utilising the multiphase debriefing structure (Healthcare Simulation AAR framework) during Phase 1; and Group A who experienced HFS utilising the multiphase debriefing structure (Healthcare Simulation AAR framework) and Group B who experienced HFS using the three-phase debriefing structure (GAS model) during Phase 2.

Table 2 Comparison of participants’ acquired knowledge through the post-test scores

In Phase 1, the mean post-test score of Group A across the three simulation sessions was 79.83%, with a standard deviation (SD) of 1.37%, significantly lower than that of Group B (mean 90.83%, SD 1.66%), a mean difference of 11.00% (p = 0.001). In Phase 2, however, the mean post-test score of Group A was 90.75% (SD 0.75%), significantly higher than that of Group B (mean 81.1%, SD 0.75%), a mean difference of 9.58% (p < 0.001) (see Table 2).

Table 3 shows the comparison of the skills developed by participants through the post-OSCE scores from the two phases of simulation sessions (Phase 1: Sim1, Sim2, and Sim3; Phase 2: Sim4, Sim5, and Sim6) from the two groups of postgraduate critical care nursing students: Group A who experienced HFS utilising the three-phase debriefing structure (GAS model) and Group B who experienced HFS utilising the multiphase debriefing structure (Healthcare Simulation AAR framework) during Phase 1; and Group A who experienced HFS utilising the multiphase debriefing structure (Healthcare Simulation AAR framework) and Group B who experienced HFS using the three-phase debriefing structure (GAS model) during Phase 2.

Table 3 Comparison of the skills developed by participants through the post-OSCE

In Phase 1, the mean post-OSCE score of Group A across the three simulation sessions was 87.39%, with an SD of 2.03%, significantly lower than that of Group B (mean 96.91%, SD 1.36%), a mean difference of 9.51% (p = 0.002). In Phase 2, however, the mean post-OSCE score of Group A was 97.70% (SD 0.95%), significantly higher than that of Group B (mean 88.53%, SD 1.60%), a mean difference of 9.17% (p = 0.002) (see Table 3).

Discussion

This study utilised a quasi-experimental research design during the post-tests and OSCEs of postgraduate critical care nursing students exposed to HFS with three-phase and multiphase debriefing structures. The results indicate that exposure to HFS with the multiphase debriefing structure (specifically, the healthcare simulation AAR) conferred a clear advantage in both the knowledge students acquired, as observed in their post-test scores, and the skills they developed, as observed in their OSCE scores. The multiphase structure produced better results for acquired knowledge and skills than the three-phase structure owing to its comprehensive and detailed sequential approach: its seven sequential steps expanded the conversational structure, allowing specific emphasis on key themes and supporting the debriefing conversation. According to Piaget, individuals generate knowledge through the interaction between experiences and ideas [10]. Piaget’s view of constructivism places the individual at the centre of the knowledge construction and acquisition process. Furthermore, an AAR provides a means to witness and review actions taken in response to an actual situation or scenario [39] and an incomparable opportunity for reflection and collective learning [40].

It is common knowledge that experience alone is insufficient for learning. This is why HFS alone is insufficient in nursing education, and the integration of self-reflection is vital. Ideally, educators or simulation facilitators who use HFS should possess adequate knowledge and have undergone sufficient training to guide students through their reflective learning process [31, 32]. The goal of reflection is to tap into learners’ consciousness so that they contemplate their actions; with this perspective, they can fully understand their pre-existing knowledge and acquire additional information, abilities, and attitudes [17, 41,42,43].

With HFS training being a pivotal process in nursing education, students were able to apply their proficiencies in an imitated setting of what can occur during an actual clinical scenario. The HFS training and multiphase debriefing structure heightened and challenged the students’ clinical reasoning, decision-making skills, and judgements. This led to a higher chance of gaining new knowledge and its transferability to actual nursing care within the healthcare facility. Essential to the process of simulation is the method of debriefing, where students were encouraged to examine and assess the ‘know-what’, ‘know-how’, and ‘know-why’ during their HFS training.

Simulation is a valuable teaching strategy that promotes a higher and more diverse level of learning through the evaluation of clinical skills in nursing and midwifery education, including undergraduate, postgraduate, and lifelong education [44, 45]. Moreover, as an active pedagogical strategy, simulation training supports the consolidation of students’ knowledge. It promotes progress in technical and relational skills and pushes students to build rules and habits of critical thinking and reflection, a necessary mindset for competent professionals [44].

One of the key concepts of Piaget’s Constructivist Theory underlines that learning is a cognitive process; learners develop thinking through trial and error, forming knowledge through experiences [46, 47]. Additionally, in the process of learning, the teacher or facilitator guides the integration of new knowledge into its existing framework. The simulation offers this type of constructivism by ensuring that students can transfer their formed knowledge and skills based on experience into actual clinical settings. Moreover, simulation fosters collaboration among the students and facilitators [47, 48]. This is strengthened and justified by the debriefing process, in which reflection is conducted to identify and create a particular and necessary change to enhance subsequent action [47, 49, 50].

Through simulation, students acquired the possibility of increasing their knowledge and skills by applying previous learning to gain new proficiency; their studied theoretical notions will be put into direct practice. Finally, through debriefing, students were able to reflect and assess the actions they performed during the simulation to come up with better and more appropriate action plans for application in real clinical settings [47, 49, 50]. Simulation is a vital educational tool for nursing students to improve their critical thinking and enhance their clinical reasoning in complex care situations [47, 51].

Limitations

This study had a small sample size and involved only one setting and location owing to the coronavirus disease 2019 restrictions on face-to-face interactions. The simulation facilitators also preferred multiphase debriefing, given their level of experience. Thus, to determine the generalisability of the findings, additional studies with larger samples in different settings are recommended. Furthermore, no pre-test or pre-OSCE was conducted, since this was the first time the students had been exposed to HFS and debriefing. Additionally, the content validity indices of the post-tests and post-OSCE rubrics were not calculated.

Conclusion

In conclusion, debriefing is a critical component of successful simulations for nursing students and practitioners. Reflecting on their actions during simulation training offers participants a safe environment to identify mistakes they may have made, think of more suitable responses, change their behaviours, and collaborate with their facilitators to improve their performance. Learning requires assessment that generates constructive criticism grounded in feedback and reflection on action. Moreover, a multiphase debriefing structure, specifically the healthcare simulation AAR, provides a significant advantage for knowledge and skills acquisition.

Recommendations

The study recommends reassessing the same participants after 3 to 6 months to evaluate transferability and learning retention. Further experimental studies with larger samples should examine debriefing timing, facilitation, and the effectiveness of the currently available debriefing structures. Intensive training programmes for nursing faculty members on facilitating debriefing are necessary to apply best practices and enable reflective discussion after simulation scenarios. Creating a safe environment for learners is conducive to conducting these debriefing sessions.

Availability of data and materials

The datasets used and analysed during the current study are available from the corresponding author upon reasonable request.

Abbreviations

HFS:

High-fidelity simulation

OSCEs:

Objective structured clinical examinations

AAR:

After-action review

GAS:

Gather, analyse, summarise

CSSC:

Clinical Skills and Simulation Center

MCQs:

Multiple-choice questions

SCE:

Simulation clinical scenario

SE:

Standard error

SD:

Standard deviation

References

  1. Gaba DM. The future vision of simulation in health care. BMJ Qual Saf. 2004;13(Suppl 1):i2–10. https://doi.org/10.1136/qhc.13.suppl_1.i2.

  2. Lundquist LL, Bilich LA, Jackson SC, Stevens KV, Tipton EJ. Measurable reflection in simulation: a pilot study. J Dent Educ. 2021;85:606–14. https://doi.org/10.1002/jdd.12506.

  3. Fey MK, Jenkins LS. Debriefing practices in nursing education programs: results from a national study. Nurs Educ Perspect. 2015;36:361–6. https://doi.org/10.5480/14-1520.

  4. Dufrene C, Young A. Successful debriefing – best methods to achieve positive learning outcomes: a literature review. Nurse Educ Today. 2014;34:372–6. https://doi.org/10.1016/j.nedt.2013.06.026.

  5. Levett-Jones T, Lapkin S. The effectiveness of debriefing in simulation-based learning for health professionals: a systematic review. JBI Libr Syst Rev. 2012;10:3295–337. https://doi.org/10.11124/jbisrir-2012-20.

  6. Decker SI, Fey MK, Sideras SA, Caballero S, Rockstraw LJ, Boese T, et al. Standards of best practice: Simulation standard VI: the debriefing process. Clin Simul Nurs. 2013;9:26–9.

  7. Dreifuerst KT. Using debriefing for meaningful learning to foster development of clinical reasoning in simulation. J Nurs Educ. 2012;51:326–33. https://doi.org/10.3928/01484834-20120409-02.

  8. Jaeger KR. Simulation enhancement of clinical reasoning skills in undergraduate nursing programs: faculty perspectives [dissertation]. Ann Arbor: University of Idaho; 2012. http://0-search.proquest.com.alpha2.latrobe.edu.au/docview/1318896191?accountid=12001.

  9. Mariani B, Cantrell MA, Meakim C, Prieto P, Dreifuerst KT. Structured debriefing and students’ clinical judgment abilities in simulation. Clin Simul Nurs. 2013;9:e147–55. https://doi.org/10.1016/j.ecns.2011.11.009.

  10. Paige JT, Arora S, Fernandez G, Seymour N. Debriefing 101: training faculty to promote learning in simulation-based training. Am J Surg. 2015;209:126–31. https://doi.org/10.1016/j.amjsurg.2014.05.034.

  11. Ryoo EN, Ha EH. The importance of debriefing in simulation-based learning: comparison between debriefing and no debriefing. Comput Inf Nurs. 2015;33:538–45. https://doi.org/10.1097/CIN.0000000000000194.

  12. Tanoubi I, Labben I, Guédira S, Drolet P, Perron R, Robitaille A, et al. The impact of a high fidelity simulation-based debriefing course on the Debriefing Assessment for Simulation in Healthcare (DASH)© score of novice instructors. J Adv Med Educ Prof. 2019;7:159–64. https://doi.org/10.30476/jamp.2019.74583.0.

  13. Issenberg SB, McGaghie WC, Petrusa ER, Lee Gordon D, Scalese RJ. Features and uses of high-fidelity medical simulations that lead to effective learning: a BEME systematic review. Med Teach. 2005;27:10–28. https://doi.org/10.1080/01421590500046924.

  14. McGaghie WC, Issenberg SB, Petrusa ER, Scalese RJ. A critical review of simulation-based medical education research: 2003–2009. Med Educ. 2010;44:50–63. https://doi.org/10.1111/j.1365-2923.2009.03547.x.

  15. Timmis C, Speirs K. Student perspectives on post-simulation debriefing. Clin Teach. 2015;12:418–22. https://doi.org/10.1111/tct.12369.

  16. Fanning RM, Gaba DM. The role of debriefing in simulation-based learning. Simul Healthc. 2007;2:115–25. https://doi.org/10.1097/SIH.0b013e3180315539.

  17. INACSL Standards Committee. INACSL standards of best practice: SimulationSM debriefing. Clin Simul Nurs. 2016;12:21–5. https://doi.org/10.1016/j.ecns.2016.09.008.

  18. Kolbe M, Grande B, Spahn DR. Briefing and debriefing during simulation-based training and beyond: content, structure, attitude and setting. Best Pract Res Clin Anaesthesiol. 2015;29:87–96.

  19. Alhaj Ali A, Miller ET, Ballman K, Bakas T, Geis G, Ying J. The impact of debriefing modalities on nurse practitioner students’ knowledge and leadership skills in managing fatal dysrhythmias: a pilot study. Nurse Educ Pract. 2020;42:102687. https://doi.org/10.1016/j.nepr.2019.102687.

  20. Beauchesne MA, Douglas B. Simulation: enhancing pediatric, advanced, practice nursing education. Newborn Infant Nurs Rev. 2011;11:28–34.

  21. Haut C, Fey MK, Akintade B, Klepper M. Using high-fidelity simulation to teach acute care pediatric nurse practitioner students. J Nurse Pract. 2014;10:e87–91.

  22. Rutherford-Hemming T, Nye C, Coram C. Using simulation for clinical practice hours in nurse practitioner education in the United States: a systematic review. Nurse Educ Today. 2016;37:128–35. https://doi.org/10.1016/j.nedt.2015.11.006.

  23. Mulvogue J, Ryan C, Cesare P. Nurse simulation facilitator experiences learning open dialogue techniques to encourage self-reflection in debriefing. Nurse Educ Today. 2019;79:142–6. https://doi.org/10.1016/j.nedt.2019.05.021.

  24. Warren JN, Luctkar-Flude M, Godfrey C, Lukewich J. A systematic review of the effectiveness of simulation-based education on satisfaction and learning outcomes in nurse practitioner programs. Nurse Educ Today. 2016;46:99–108. https://doi.org/10.1016/j.nedt.2016.08.023.

  25. Ha EH. Effects of peer-led debriefing using simulation with case-based learning: written vs. observed debriefing. Nurse Educ Today. 2020;84:104249. https://doi.org/10.1016/j.nedt.2019.104249.

  26. Kang K, Yu M. Comparison of student self-debriefing versus instructor debriefing in nursing simulation: a quasi-experimental study. Nurse Educ Today. 2018;65:67–73. https://doi.org/10.1016/j.nedt.2018.02.030.

  27. Mulli J, Nowell L, Lind C. Reflection-in-action during high-fidelity simulation: a concept analysis. Nurse Educ Today. 2021;97:104709. https://doi.org/10.1016/j.nedt.2020.104709.

  28. Zhang H, Wang W, Goh SHL, Wu XV, Mörelius E. The impact of a three-phase video-assisted debriefing on nursing students’ debriefing experiences, perceived stress and facilitators’ practices: a mixed methods study. Nurse Educ Today. 2020;90:104460. https://doi.org/10.1016/j.nedt.2020.104460.

  29. Park S, Hur HK, Chung CW. Learning effects of virtual versus high-fidelity simulations in nursing students: a crossover comparison. BMC Nurs. 2022;21:100. https://doi.org/10.1186/s12912-022-00878-2.

  30. Phrampus P, O’Donnell J. Debriefing using a structured and supported approach. In: Levine A, DeMaria S, Schwartz A, Sim A, editors. The comprehensive textbook of healthcare simulation. 1st ed. New York: Springer; 2013. p. 73–84.

  31. Cheng A, Rodgers DL, van der Jagt É, Eppich W, O’Donnell J. Evolution of the Pediatric Advanced Life Support Course: enhanced learning with a new debriefing tool and web-based module for Pediatric Advanced Life Support instructors. Pediatr Crit Care Med. 2012;13:589–95. https://doi.org/10.1097/PCC.0b013e3182417709.

  32. Sawyer T, Eppich W, Brett-Fleegler M, Grant V, Cheng A. More than one way to debrief: a critical review of healthcare simulation debriefing methods. Simul Healthc. 2016;11:209–17. https://doi.org/10.1097/SIH.0000000000000148.

  33. Chameides L, Samson RA, Schexnayder SM, Hazinski MF, editors. Pediatric Advanced Life Support Provider Manual. Dallas: American Heart Association; 2011.

  34. Sawyer TL, Deering S. Adaptation of the US Army’s after-action review for simulation debriefing in healthcare. Simul Healthc. 2013;8:388–97. https://doi.org/10.1097/SIH.0b013e31829ac85c.

  35. Kolb DA, Fry R. Toward an applied theory of experiential learning. In: Cooper C, editor. Theories of group processes. London: Wiley; 1975. p. 33–57.

  36. Taber KS. The use of Cronbach’s alpha when developing and reporting research instruments in science education. Res Sci Educ. 2018;48:1273–96.

  37. Cohen J. A coefficient of agreement for nominal scales. Educ Psychol Meas. 1960;20:37–46.

  38. McHugh ML. Interrater reliability: the Kappa statistic. Biochem Med. 2012;22:276–82.

  39. World Health Organization. International Health Regulations (2005): IHR monitoring and evaluation framework. Geneva: World Health Organization; 2018.

  40. World Health Organization. Guidance for after action review (AAR). Geneva: World Health Organization; 2019.

  41. Dismukes RK, Gaba DM, Howard SK. So many roads: facilitated debriefing in healthcare. Simul Healthc. 2006;1:23–5. https://doi.org/10.1097/01266021-200600110-00001.

  42. Rodgers C. Defining reflection: another look at John Dewey and reflective thinking. Teach Coll Rec. 2002;104:842–66.

  43. Rudolph JW, Simon R, Dufresne RL, Raemer DB. There’s no such thing as “nonjudgmental” debriefing: a theory and method for debriefing with good judgment. Simul Healthc. 2006;1:49–55. https://doi.org/10.1097/01266021-200600110-00006.

  44. Martins JCA. Learning and development in simulated practice environments. Rev Enferm Refer. 2017;4(12):155–62. https://doi.org/10.12707/RIV16074.

  45. Park KO, Seo KW, Jeon YG, Song YS. Integrative review for simulation-based learning research in nursing education. J Korea Academy Simul Nurs. 2015;4(1):41–58.

  46. Hmelo-Silver CE, Marathe S, Liu L. Fish swim, rocks sit, and lungs breathe: expert-novice understanding of complex systems. J Learn Sci. 2007;16:307–31.

  47. World Health Organization. Simulation in nursing and midwifery education. Regional Office for Europe; 2018. https://www.euro.who.int/__data/assets/pdf_file/0011/383807/snme-report-eng.pdf?ua=1. Accessed 1 Oct 2018.

  48. Jonassen DH. Thinking technology: toward a constructivist design model. Educ Tech. 1994;34:34–7.

  49. Baptista RCN, Martins JCA, Pereira MFCR, Mazzo A. Students’ satisfaction with simulated clinical experiences: validation of an assessment scale. Rev Lat Am Enfermagem. 2014;22:709–15. https://doi.org/10.1590/0104-1169.3295.2471.

  50. Weaver A. High-fidelity patient simulation in nursing education: an integrative review. Nurs Educ Perspect. 2011;32:37–40. https://doi.org/10.5480/1536-5026-32.1.37.

  51. Bagnasco A, Pagnucci N, Tolotti A, Rosa F, Torre G, Sasso L. The role of simulation in developing communication and gestural skills in medical students. BMC Med Educ. 2014;14:106. https://doi.org/10.1186/1472-6920-14-106.

Acknowledgements

We extend our gratitude to the Fakeeh College for Medical Sciences Nursing Department and faculty members involved in the MSc Critical Care Nursing Program and the facilitators at the Clinical Skills and Simulation Center.

Authors’ information

Jefferson Garcia Guerrero has a Bachelor of Science in Nursing (BSN) with a postgraduate degree of Doctor in Nursing Science (DNS) and a Doctor of Philosophy in Nursing (Ph.D.) and currently works at Fakeeh College for Medical Sciences (FCMS) in Jeddah, Saudi Arabia as an Assistant Professor and Director of the MSc Critical Care Nursing program.

Grace Medalyn Castro has a Bachelor of Science in Nursing (BSN) with a postgraduate degree of Doctor in Educational Management (DEM) and currently works at Fakeeh College for Medical Sciences (FCMS) in Jeddah, Saudi Arabia, as an Assistant Professor.

Minerva Pingue-Raguini has a Bachelor of Science in Nursing (BSN) with a postgraduate degree of Doctor of Philosophy in Educational Management (Ph.D.) and currently works at Fakeeh College for Medical Sciences (FCMS) in Jeddah, Saudi Arabia, as an Assistant Professor.

Funding

This research did not receive any specific grant from funding agencies in the public, commercial, or not-for-profit sectors.

Author information

Authors and Affiliations

Authors

Contributions

The authors confirm their contributions to the manuscript as follows: study conception and design: J.G.G.; data collection: G.M.C. and M.P.R.; analysis and interpretation of results: J.G.G, G.M.C. and M.P.R.; draft manuscript preparation: J.G.G. All authors reviewed the results and approved the final version of the manuscript.

Corresponding author

Correspondence to Jefferson Garcia Guerrero.

Ethics declarations

Ethics approval and consent to participate

Ethical approval for this study was obtained from Fakeeh College for Medical Sciences Institutional Review Board (Approval No. 288/IRB), and all methods were performed in accordance with the Declaration of Helsinki. Written informed consent was obtained from all participants.

Competing interests

The authors declare no conflict of interest.

Consent for publication

Not applicable.

Additional information

Publisher’s Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/. The Creative Commons Public Domain Dedication waiver (http://creativecommons.org/publicdomain/zero/1.0/) applies to the data made available in this article, unless otherwise stated in a credit line to the data.


About this article


Cite this article

Guerrero, J.G., Tungpalan-Castro, G.M. & Pingue-Raguini, M. Impact of simulation debriefing structure on knowledge and skill acquisition for postgraduate critical care nursing students: three-phase vs. multiphase. BMC Nurs 21, 318 (2022). https://doi.org/10.1186/s12912-022-01100-z


Keywords

  • High-fidelity simulation
  • Debriefing structure
  • Constructive criticism
  • Knowledge acquisition
  • Skills development
  • Clinical competence