
Effectiveness of applying clinical simulation scenarios and integrating information technology in medical-surgical nursing and critical nursing courses

Abstract

Background

To determine the impact of combining clinical simulation scenario training and Information Technology Integrated Instruction (ITII) on the teaching of nursing skills.

Methods

Participants were 120 fourth-year students in a nursing program who were enrolled in medical and surgical nursing courses; 61 received innovative instruction (experimental group) and 59 received conventional instruction (control group). The ADDIE model, a systematic method of course development comprising analysis, design, development, implementation, and evaluation, was used to build simulation teaching and clinical scenarios and to create and modify objective structured clinical examination (OSCE) scenario checklists for acute myocardial infarction (AMI) care, basic life support and operation of an automated external defibrillator (BLS), and subdural hemorrhage (SDH) care. The modified OSCE checklists were assessed for reliability, consistency, and validity. The innovative training included flipped classrooms, clinical simulation scenarios, ITII, and blended learning formats.

Results

The reliability and validity of the OSCE checklists developed in this study were acceptable and comparable to or higher than those of checklists in past studies, so the checklists could be utilized as an OSCE performance tool. Students receiving innovative instruction obtained significantly better OSCE performance, lab scores, and improvement over the previous year’s grades. Significant differences were found in situational awareness (SA). No strong correlations were found between OSCE scores and clinical internship scores, and no significant differences were found between the groups in overall clinical internship performance.

Conclusions

Innovative instruction showed better performance than conventional methods in summative evaluation of knowledge components, OSCE formative evaluation and clinical nursing internship scores, as well as improved situational awareness in nursing students.


Introduction

Nurses play a crucial role in the healthcare industry, offering comprehensive patient care and establishing close relationships with patients. High-quality clinical care depends on thorough training and assessment methods, making proper nursing education essential. Objective structured clinical examinations (OSCEs) were designed by Harden et al. in 1975 to evaluate the clinical competency and skills of graduating medical students and have since been adapted for nursing students as well. In OSCEs, students spend 5 min each at a series of test stations, where they are observed as they interact with clinical environments with either real patients or actors. OSCEs extensively test clinical skills, stimulate learning, and have been found to be less biased and to have better reliability, validity, and objectivity than other evaluation methods [1,2,3,4,5]. Studies have shown that combining simulation-based education and OSCEs can reinforce the objectivity of clinical competency evaluations [6, 7]. OSCEs have also been verified as an effective, outcome-based tool for evaluating students and their clinical performance after graduation [8]. In nursing education, OSCEs have become increasingly integrated with multiple disciplines and are used as both formative and summative evaluations. Therefore, this study utilized the OSCE as a tool to evaluate student performance.

However, although OSCEs provide a way to evaluate and improve the transfer of classroom and lab learning into simulated clinical scenarios, some argue that it is inappropriate to assume that such simulation performance translates into real-world competence [7, 8]. This may be due to the uncertain and unpredictable nature of the care process itself. Therefore, integrating skills such as perception, comprehension, and projection into OSCEs can further fortify the quality of care. Situational awareness (SA) refers to how people pay attention to surrounding events, the information they notice, and their use of that information to form plans or decisions [9, 10]. The three SA stages of cognitive performance are perception (SA1), comprehension (SA2), and projection (SA3). In SA1, people use sensory input to understand events occurring in the surrounding environment. In SA2, perceived information is processed to identify relationships between cues and task goals and to fully understand the situation. In SA3, predictions of future results are formed based on SA1 and SA2. The formulation of actions and strategies required in the future, built on all three stages, is the highest form of SA [9, 11,12,13]. These three stages are dynamic, and the appropriateness of a person’s current SA and the quality of their decision-making and performance depend on the task at hand and the individual and system factors involved, such as knowledge, perception, task goals, level of understanding, interpretation of messages, attention, and memory [14]. Nursing care is also a dynamic process; therefore, nurses should be trained toward the highest levels of SA to ensure the highest-quality patient outcomes and safety.

Student evaluations to assess evidence of learning are an essential part of systemic approaches to teaching [15]. Summative and formative evaluations reflect the realities, operational difficulties, and challenges of the teaching process to support learning in the future [16]. With the rapid development of the Internet, information skills have become key to students’ competitiveness. Online education – whether in the form of massive open online courses (MOOCs), small private online courses (SPOCs), interactive educational software, simulation devices, teaching platforms, sharing platforms, or otherwise – has subverted conventional teaching methods. As a result, continuous change and innovations such as flipped classrooms and blended teaching continue to challenge educational institutions at every level, as well as instructors and students [17,18,19,20]. Wang and Zhu [21] compared MOOC-based flipped learning against conventional instruction in an inorganic chemistry course and found that students in the former performed better than those in the latter; furthermore, students in the former found the MOOC-based curriculum helpful for the knowledge component of the course, but not for the practicum and feedback. Li, Zhang and Hu [22] explored flipped learning on the Moodle online platform and found increased student autonomy, motivation, and content knowledge. The ways in which technology can be effectively incorporated into nursing education merit research and discussion.

Rather than the one-way lectures of conventional nursing education, innovative instruction integrates technological products into teaching to promote interaction between instructors and students in order to boost student motivation and performance. The purpose of this study is to integrate applications and products such as SPOCs, Zuvio, an advanced simulator, the smart Little Anne QCPR manikin, an AED, Moodle, and LINE into nursing education strategies for internal medicine, surgical nursing, and critical care nursing courses for fourth-year students in 5-year Taiwanese junior college programs, and to attempt to understand whether innovative instruction (in the form of information technology integrated instruction, ITII) makes a difference in OSCE performance or has any effect on SA. The conceptual construct for this study is shown in Fig. 1. In summary, the purposes of this study are as follows:

  1. To build clinical scenarios for medical and surgical nursing courses to be combined with OSCE, and to test the validity and reliability of the resulting OSCE checklist.

  2. To investigate the correlations between the medical and surgical clinical internship scores and OSCE scores in innovative and conventional instruction.

  3. To investigate the differences in OSCE performance between innovative and conventional instruction.

  4. To explore the differences in situational awareness between innovative and conventional instruction.

  5. To explore how innovative and conventional instruction affect knowledge-based summative evaluations.

Fig. 1 Conceptual Construct Diagram

Material and methods

Participants

This study was approved by the Institutional Review Board of St. Martin De Porres Hospital (IRB No. 18B-12). Using purposive sampling, this study recruited fourth-year students in a 5-year program at a medical and nursing management junior college in central Taiwan. Prior to the first semester of the 2019 academic year, fourth-year students who started in the program in 2016 were identified. Student recruits were informed in detail of the purpose and procedures of the study, as well as any potential risks and benefits, and were given ample time to consider before signing consent. A total of 120 students were enrolled in the study, aged between 19 and 20 years; five were male and the rest female. Consent from legal guardians was obtained for participants under 20 years of age. All participants were given the right to terminate participation in the study at any time without any conditions.

Study design

In order to compare student performance between conventional and innovative instruction, purposive sampling was used to divide the participants into experimental and control groups. Course contents of the two groups were similar; only the method of instruction differed – innovative instruction for the experimental group, and conventional for the control group. Within-group factors were simulated scenarios and levels of situational awareness. The formative evaluations for the three clinical simulation scenarios (AMI, BLS, and SDH) were performed through revised OSCEs, and SA factors were added to simulated scenarios to explore differences in SA levels under different scenarios. At the end of the semester, all students were evaluated by the instructor of each class on their summative performance in medical and surgical nursing, medical and surgical nursing labs, and medical and surgical nursing internships, as well as their OSCE performance under different simulated scenarios and their situational awareness. Data were collected at the end of the semester through the student grade performance system.

Research tools

Medical and surgical nursing courses

Medical and surgical nursing is a core competency course as well as a major subject in Taiwan’s national nursing examination. Students are taught to apply basic biomedical and scientific knowledge; assess and analyze patients’ physical, mental, spiritual, and sociocultural responses; provide suitable nursing measures; and accurately implement and apply relevant techniques to clinical practice. The experimental group received innovative instruction, and the control group received conventional instruction, as defined in Table 1.

Table 1 Experimental Group vs Control Group Syllabi

Course development and design of clinical simulation scenario lesson plans

Course integration and development of clinical scenario lesson plans were accomplished in two stages, as per Thomas et al. [23]. Clinical simulation scenario learning was then added, along with explorations based on the outcomes of ITII.

Stage 1: integrating the medical–surgical and critical nursing courses

The medical and surgical nursing course development team comprised eight nursing instructors, three clinical nursing experts, and one clinical physician. They integrated the course syllabus, purpose, teaching strategies, teaching progress and content, performance evaluations, textbooks, and reference books of teaching courses originally aimed at fourth-year students of a 5-year continuous nursing program. The course development team also incorporated scenario simulations with ITII as a teaching strategy and established a critical nursing course using flipped classroom concepts and a revised OSCE.

Stage 2: developing scenario simulation teaching and clinical scenario teaching templates with the ADDIE model

1. Developing scenario simulation teaching and clinical scenario teaching templates

Appropriate clinical teaching templates were developed referencing the Human Patient Simulation Scenario Development Patient Case Template (HPSSDPCT) [24, 25] and the Template of Events for Applied and Critical Healthcare Simulation (TEACH Sim) [24] and using the Analysis, Design, Development, Implementation, and Evaluation (ADDIE) [26,27,28] teaching design model, while also introducing scenario simulation teaching into the lesson plans. First, in the Analysis phase, researchers analyzed the students, the curriculum, the training tools, and the learning environment. Based on the results of these analyses, the Design phase employed SMART (Specific, Measurable, Achievable, Relevant, Timely) principles to formulate the course syllabus, content set-up, and objectives. In the Development phase, experts in clinical nursing and medicine discussed course content such as presentation, activities, interface design, and course feedback, generating an instructor’s manual and a student’s manual. During the Implementation phase, plans from the previous phase were executed through programming design, script writing for simulation scenarios, and visual design. Lastly, the Evaluation phase assessed the effectiveness of the course content and interface from the previous phases. Three KSA domains—knowledge (cognitive), skills (psychomotor), and attitude (affective)—were used to set learning priorities, key events, and target responses.

2. Designing the clinical scenario simulation lesson plans

The medical and surgical nursing course development team formulated clinical scenario lesson plans based on five topics: percutaneous transluminal coronary angioplasty (PTCA) and stent placement care for patients with acute myocardial infarction (AMI); basic life support (BLS) and automated external defibrillator (AED) operation; subdural hemorrhage (SDH) care; applications of the advanced Apollo Simulator for patients with septic shock; and acute respiratory distress syndrome (ARDS) care.

Clinical nursing and medical experts in each medical field were invited to join the development team to discuss and revise the teaching scenarios. Next, two pilot tests were conducted, and the scenarios were revised again based on the test results to form the final version. Each class session was 30 to 90 min in length, depending on the topic scenario.

OSCE checklist development and reliability and validity tests

The OSCE is a measurement tool that deconstructs observable tasks to evaluate student performance, so all OSCE checklists must be reliable and valid. Therefore, after the first drafts of the OSCE checklists for the three scenario simulation lesson plans in this study were completed, they were tested for face validity, content validity, and criterion validity [2].

1. Creating the first draft of the checklists

OSCE checklists were created for PTCA care for patients with AMI, BLS and AED operation (henceforth “BLS”), and SDH care. Both septic shock care and ARDS care were taught as part of the coursework but were excluded from the OSCEs because they required advanced simulators not available to the study team. The checklists itemized the required actions in the care procedure, and each checklist item was graded based on whether it was fully completed (2 points), partially completed (1 point), or not completed (0 points).
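As a concrete illustration, the 0/1/2 scoring rule described above can be totaled as in the sketch below; the function name and the item scores are invented for this example and are not from the study’s checklists.

```python
def score_checklist(item_scores):
    """Sum per-item scores (0 = not completed, 1 = partially completed,
    2 = fully completed) and express the total as a percentage of the
    maximum attainable score."""
    if any(s not in (0, 1, 2) for s in item_scores):
        raise ValueError("each item must be scored 0, 1, or 2")
    total = sum(item_scores)
    max_total = 2 * len(item_scores)
    return total, round(100 * total / max_total, 1)

# Example: a 5-item checklist with three full, one partial, one missed item.
total, pct = score_checklist([2, 2, 2, 1, 0])  # → (7, 70.0)
```

The percentage form makes checklists of different lengths (26, 25, and 13 items here) comparable on a common scale.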

2. Performing expert content validity tests

The first drafts of the three clinical scenario checklists were reviewed by 17 clinical experts and three fifth-year nursing students for appropriateness, clarity, conciseness, and wording, as well as content validity. Using the content validity index (CVI), each item of the evaluation survey was assigned 1 to 4 points on a 4-point Likert scale, wherein 4 was “very appropriate,” 3 was “appropriate,” 2 was “inappropriate,” and 1 was “very inappropriate.” Each checklist could receive up to 100 total points. Each checklist item was assessed by a subject-matter expert, and for every checklist, the percentage of items scored “appropriate” or “very appropriate” was calculated. Checklists were deemed to have a favorable validity index if their score was at least 80% [29].

The checklist evaluations and expert opinions were then organized into a summary table. Items that received 3 or 4 points were retained; items that received only 1 or 2 points or were considered unclear or badly-worded were either revised or deleted after the course development team considered the expert commentary.
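The CVI calculation described above reduces to a simple proportion. The sketch below, with invented ratings, illustrates the 80% acceptance threshold; it is a generic illustration of the index, not the study’s code.

```python
def content_validity_index(ratings):
    """Proportion of items rated 3 ('appropriate') or 4 ('very appropriate')
    on the 4-point Likert scale; a value of at least 0.80 is treated as a
    favorable validity index."""
    if not ratings:
        raise ValueError("ratings must be non-empty")
    return sum(1 for r in ratings if r >= 3) / len(ratings)

# Example: 9 of 10 hypothetical items rated appropriate or better.
cvi = content_validity_index([4, 4, 3, 4, 2, 4, 3, 4, 4, 4])
acceptable = cvi >= 0.80
```

With these invented ratings the CVI is 0.90, above the 0.80 cutoff; the study’s three checklists reported CVIs of 0.981 to 0.987.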

3. Reliability testing

The revised checklists developed for the three clinical scenario topics (AMI, BLS and AED, and SDH) were rated for inter-rater reliability using the Kendall coefficient of concordance. Six students were sampled for each topic, and the OSCEs were held in the clinical skills center and recorded on video. Four nursing instructors evaluated the performances according to the three revised checklists. The reliability of the three OSCE checklists was also evaluated using Cronbach’s α.
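Kendall’s W can be computed from an m-rater-by-n-student score matrix as sketched below; this is a generic implementation of the statistic (assuming no tied ranks), and the four raters’ scores for six students are invented for illustration.

```python
import numpy as np
from scipy.stats import rankdata

def kendalls_w(scores):
    """Kendall's coefficient of concordance for an (m raters x n subjects)
    score matrix: W = 12S / (m^2 (n^3 - n)), where S is the sum of squared
    deviations of the column rank sums from their mean. W = 1 means perfect
    agreement among raters; W = 0 means none (assumes no tied ranks)."""
    scores = np.asarray(scores, dtype=float)
    m, n = scores.shape
    ranks = np.apply_along_axis(rankdata, 1, scores)  # rank within each rater
    rank_sums = ranks.sum(axis=0)
    s = ((rank_sums - rank_sums.mean()) ** 2).sum()
    return 12 * s / (m ** 2 * (n ** 3 - n))

# Four hypothetical raters scoring six students; the fourth rater disagrees
# on two students, so W is high but below 1.
w = kendalls_w([[90, 85, 70, 60, 80, 75],
                [88, 86, 72, 58, 79, 77],
                [92, 84, 71, 61, 82, 74],
                [89, 87, 69, 62, 76, 81]])
```

A W near 1, as in this example, corresponds to the strong agreement among the four nursing instructors reported in Table 2.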

Determination of SA levels

Research has shown that SA influences decision-making and is affected by factors such as knowledge, perception, task goals, degree of understanding, message deciphering, level of attention, and memory [9]. SA items were added to this study because the unpredictable nature of the care environment and complexity of patient care demand high levels of situational awareness. Two clinical design experts and three human-factors engineering experts were invited to the course development team meeting to discuss and determine the necessary SA of each item; the determination of SA levels was reached through consensus, and they were integrated into the OSCE checklists. The levels were determined as follows:

  (1) SA1 (perception): review, confirmation, basic interpretation, the execution of basic skills, and patient identification.

  (2) SA2 (comprehension): the interpretation of information, further treatments following observation, decisions and treatments based on SA1, and reports.

  (3) SA3 (projection): advanced interpretations, continued treatment, proposals of precaution and health education, decisions and treatments based on SA2, and reports.

Application of ITII

Seven information technology tools were used in accordance with the course progress and clinical scenario topics:

1. EWANT (SPOCs)

Based on the medical and surgical nursing course outlines, videos were recorded in school MOOC classrooms, edited and then uploaded to the Ministry of Education’s EWANT online learning platform to create SPOCs. The instructor released 20 learning topics based on course progress, and after receiving their account and password, the experimental group students could log in to the platform and engage in self-learning before class. Statistical information regarding student engagement and duration was collected from EWANT.

2. Zuvio (interactive classrooms)

Zuvio is an e-learning platform that allows lesson preparation and extensive student-teacher interaction during class. The campus version of Zuvio was used in the clinical simulation courses. During class, the instructor presented course materials based on the scenario progress, which included videos, physician orders, electrocardiograms, and inspection reports. Students could watch the materials on their smartphones and engage in interactive activities in real-time, such as interpreting clinical data and answering questions. Zuvio was also used for post-class tests, reviews, and roll call.

3. MOODLE (teaching platform)

The Moodle teaching platform (version 3.2.3+, build: 20170512) was used for its administrative and management functions, such as instructor announcements, tracking teaching progress, file storage, and teaching evaluations.

4. LINE messaging app

All participants in the experimental group interacted with the instructors using the messaging app LINE, through which they also received messages, announcements, and reminders and participated in after-class discussions. Furthermore, all the clinical simulation scenario files were uploaded to the LINE group for students to download.

5. High-fidelity wireless simulator

The high-fidelity wireless simulator used in this study was the Apollo Patient Simulator (CAE Healthcare, Canada), which has built-in batteries and gas compressors. During scenario lessons, the simulator’s “monitored” physiological data was projected onto the classroom screen while the Apollo Simulator presented the corresponding clinical symptoms, allowing students to perform physical examinations, interpret the preset physiological data to uncover patient problems, and implement appropriate nursing measures. The scenario topics that used this setup included professional nursing skills and knowledge for monitoring and assessing vital signs; measurement of central venous pressure and catheter care; endotracheal tube care; sputum suctioning; oxygen therapy use and efficacy assessment; urinary catheter care; assessment of states of consciousness; nasogastric tube care; and feeding methods.

6. Little Anne QCPR

The “Little Anne QCPR” (model number 123–01050, Laerdal) is an adult upper-torso manikin that provides detection of and feedback on high-quality CPR metrics such as chest compression depth and rate, respiratory volume accuracy, and so on. This smart teaching aid was used for teaching BLS skills, in the AED simulation scenarios, and in the revised OSCEs for testing student performance. When in use, the manikin was paired with a phone or tablet app to display and record operational results.

7. AED trainer 3

The AED Trainer 3 by Philips was used in BLS and AED simulation scenario teaching and the student OSCEs. This model is compliant with the American Heart Association first-aid guidelines. The trainer’s first-aid procedures were configured by the instructors, and the trainer was connected to a simulator.

Statistical analysis

The data were organized and analyzed using SPSS 23.0 (IBM Corp., Armonk, NY). The statistical methods included descriptive statistics, independent samples t tests (p < 0.05 was considered significant), the Kendall coefficient of concordance, Cronbach’s α, and Pearson correlation coefficient analysis. Descriptive statistics summarized student performance in medical and surgical nursing, medical and surgical nursing labs, and medical and surgical nursing internships; independent samples t tests compared student performance between the experimental and control groups; the Kendall coefficient of concordance tested inter-rater reliability; Cronbach’s α assessed the reliability of the OSCE checklists; and Pearson correlation coefficient analysis explored the correlations between the groups’ OSCE scores and internship scores.
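Of these statistics, Cronbach’s α is the one least commonly built into general-purpose libraries; a minimal NumPy implementation (with invented item scores, not the study’s data) looks like this:

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for an (n respondents x k items) score matrix:
    alpha = k/(k-1) * (1 - sum of item variances / variance of totals).
    Sample variances (ddof=1) are used throughout."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_vars / total_var)

# Perfectly consistent hypothetical items yield alpha = 1.
alpha = cronbach_alpha([[1, 1], [2, 2], [3, 3]])
```

The same formula applied to the study’s checklist data produced the α values of 0.608 to 0.797 reported in the Results.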

Results

A total of 120 participants were recruited into the study: 61 participants into the experimental group and 59 into the control group. In the control group, one student withdrew from school during the clinical internship, so there were only 58 sets of clinical internship grades (missing value = 1).

Reliability and validity analysis of the three OSCE checklists

The AMI OSCE checklist had a CVI of 0.981; four items were revised, one item was deleted, and 26 items were retained. The BLS and AED OSCE checklist had a CVI of 0.987; three items were revised and 25 items were retained. The SDH OSCE checklist had a CVI of 0.981; one item was revised and 13 items were retained.

The Kendall’s coefficient of concordance values of the three clinical scenarios are presented in Table 2 and indicated significant correlations among the nursing instructors’ evaluations, and therefore the inter-rater reliability and consistency of the checklists were deemed acceptable.

Table 2 Kendall Coefficient of Concordance

The Cronbach’s α values were 0.608 for the AMI checklist, 0.797 for the BLS checklist, and 0.761 for the SDH checklist, with the latter two achieving high reliability.

Learning outcomes of the clinical simulation scenarios and ITII

Medical and surgical nursing course grades before and after intervention

Since the study started in the fourth year, third-year course grades served as the baseline for both groups before the start of the courses, and fourth-year grades as one of the outcomes of the intervention. Comparisons between the two groups were therefore made at baseline (Year 3) and after the intervention (Year 4), with the independent samples t test results shown in Table 3. While the two groups did not differ significantly at baseline (Year 3) [t(61.59) = 1.229, p = 0.222, Cohen’s d = 0.22], there was a significant difference between the two groups after the intervention (Year 4) [t(61.59) = 2.392, p = 0.018, Cohen’s d = 0.46].

Table 3 Independent Samples t Tests of Participants’ Third- and Fourth-Year Medical and Surgical Nursing Grades

df, degrees of freedom.
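The group comparisons above can be reproduced with a Welch t-test plus a Cohen’s d computed from the pooled standard deviation, as sketched below; the two score lists are invented for illustration and are not the study’s data.

```python
import numpy as np
from scipy.stats import ttest_ind

# Hypothetical grade lists for an experimental and a control group.
exp = np.array([82., 79., 85., 77., 90., 84., 78., 88.])
ctl = np.array([75., 72., 80., 70., 78., 74., 76., 73.])

# Welch's t-test (does not assume equal group variances).
t, p = ttest_ind(exp, ctl, equal_var=False)

# Cohen's d from the pooled standard deviation.
n1, n2 = len(exp), len(ctl)
pooled_sd = np.sqrt(((n1 - 1) * exp.var(ddof=1) + (n2 - 1) * ctl.var(ddof=1))
                    / (n1 + n2 - 2))
d = (exp.mean() - ctl.mean()) / pooled_sd
```

By the usual conventions, d ≈ 0.2 is a small effect, 0.5 medium, and 0.8 large, which is how the d values of 0.22 and 0.46 above can be read.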

Correlation between clinical simulation scenarios and internship OSCE results

Correlations between the medical and surgical clinical internship scores and OSCE results were determined through Pearson correlation coefficients. Only the experimental group showed a significant correlation, between their BLS OSCE scores and their medical and surgical clinical internship scores [r(61) = 0.301, p = 0.018], a low positive correlation (Table 4).

Table 4 Medical and Surgical Clinical Internship OSCE Scores: Pearson Correlation Coefficients Between Innovative and Conventional Teaching Methods
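A Pearson correlation of this kind can be computed directly from paired score lists; the sketch below uses invented OSCE and internship scores, not the study’s data, and is a generic illustration of the statistic.

```python
import numpy as np
from scipy.stats import pearsonr

# Hypothetical paired OSCE and internship scores for eight students.
osce = np.array([88., 75., 92., 70., 85., 78., 90., 82.])
internship = np.array([84., 78., 88., 74., 80., 79., 86., 81.])

# r lies in [-1, 1]; p tests the null hypothesis of no correlation.
r, p = pearsonr(osce, internship)
```

The invented data are deliberately strongly correlated; the study’s observed r of 0.301 would instead be read as a low positive correlation.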

Clinical simulation scenarios and ITII on medical and surgical nursing lab scores and internship scores

The medical and surgical nursing lab scores and clinical internship scores of the experimental and control groups were compared using independent samples t tests. For the lab scores, the experimental group’s average was significantly higher than the control group’s by 3.46 points [t(61.58) = 1.944, p = 0.048, Cohen’s d = 0.36] (Table 5). For the clinical internship scores, there were no significant differences between the two groups, with the experimental group outperforming the control group by only 0.04 points (Table 5).

Table 5 Medical and Surgical Nursing Lab Scores and Clinical Internship Scores for Innovative and Conventional Teaching Methods

Influence of clinical simulation scenarios and ITII on OSCE formative evaluations

The total OSCE scores and the AMI, BLS, and SDH scores of the experimental and control groups were compared using independent samples t tests. The experimental group significantly outperformed the control group in OSCE total scores [t(61.59) = 7.885, p < 0.01, Cohen’s d = 1.44], AMI scores [t(61.59) = 6.840, p < 0.01, Cohen’s d = 1.25], and SDH scores [t(61.59) = 6.469, p < 0.01, Cohen’s d = 1.18], the last by nearly 15 points (Table 6).

Table 6 Influence of Innovative and Conventional Teaching Methods on OSCE and the Three Topics

Clinical simulation scenarios and ITII on SA among the three OSCE topics

The SA1, SA2, and SA3 scores for the three OSCE topics were compared using independent samples t tests. The results showed significant differences in AMI SA2 [t(61.59) = 5.171, p < 0.01, Cohen’s d = 0.95] and SA3 [t(61.59) = 8.989, p < 0.01, Cohen’s d = 1.64]; in BLS SA1 [t(61.59) = − 2.215, p = 0.029, Cohen’s d = 0.40], SA2 [t(61.59) = − 2.146, p = 0.034, Cohen’s d = 0.39], and SA3 [t(61.59) = 8.982, p < 0.01, Cohen’s d = 1.63]; and in SDH SA1 [t(61.59) = 4.395, p < 0.01, Cohen’s d = 0.80] and SA2 [t(61.59) = 6.296, p < 0.01, Cohen’s d = 1.14]. Except for BLS SA1 and SA2, in which the control group scored higher, the experimental group outperformed the control group across the board, especially in BLS SA3 (Table 7).

Table 7 Innovative and Conventional Teaching Methods on OSCE SA

Discussion

Building clinical scenario lesson plans for medical and surgical nursing

In this study, course integration and scenario development were carried out by a medical and surgical nursing course development team drawn from the nursing faculty together with a cross-professional team of clinical nurses and physicians. Combining the ADDIE model [26] with the HPSSDPCT and Benishek et al.’s TEACH Sim template [24] to construct the simulation scenario lesson plans was highly beneficial. In the construction process, reaching a team consensus on the guidelines, technical handbooks, textbooks, and clinical practices was challenging; industry–academia differences had to be reduced or addressed during this process. The scenarios had to be designed to resemble clinical practices as much as possible, and purposeful simulation designs were expected to effectively improve the structure, process, or results of the course goals and/or the institution. Furthermore, the scenario materials were based on actual case files from St. Martin De Porres Hospital and the Internet and were confirmed by the course development team and clinical specialist physicians. In accordance with Lioce et al.’s proposition, the simulation-based experiences were specifically designed to achieve confirmed goals [30]. The standardized simulation designs provided a framework for building effective simulation-based experiences and favorable evidence for adult learning fields, education, teaching design, clinical care standards, evaluations, and simulations.

Developing OSCE instruments and their reliability and validity

The OSCEs were developed through integrated scenario building and were based on McWilliam and Botwinski’s guidelines [31]: 1) case scenarios must be built by experts and instructors within the topic to test the effectiveness of the competency being evaluated; and 2) if more than one professional field is being tested, then experts in all of the relevant disciplines should work together while keeping current conditions in mind and adhering to clinical procedures. The development of the OSCE checklists was based on teaching goals and the research purpose but did not follow the Angoff scoring method. Instead, the percentage of each checklist item was converted into points, and in accordance with Harden et al. and Wessel et al., the checklists were then expanded to include more fields to address the shortcomings of binomial options [32, 33]. Furthermore, based on Fox et al.’s three-item scoring method, scores were assigned for fully completed (2 points), partially completed (1), and not completed at all (0) [34], with the tasks and number of actions to be completed clearly defined. Moreover, after a consensus had been reached on the scenario outlines of each topic, content validity was tested by consulting clinical specialist physicians, specialist nurses, hospital senior nurses, infection control nurses, supervisors, and four nursing students. Rushforth argued that evaluation standards must be developed for any OSCE [2]; in this study, face validity, content validity, and criterion validity were all tools for scoring construction, and the three OSCE checklists scored CVIs of 0.981 to 0.987. Using Kendall’s W, the revised checklists showed significant inter-rater reliability among three sets of six unrepeated students and four examiners, thus indicating high levels of both reliability and consistency.
Furthermore, the Cronbach’s α values of the AMI, BLS, and SDH checklists were between 0.608 and 0.797, with the BLS and SDH checklists reaching high reliability. The Cronbach’s α values in this study were all comparable to or slightly higher than values found in the literature [34,35,36,37,38]; internal consistency was probably higher because multiple experts were involved in all stages, the lesson plan topics were focused, and the checklist items were objective, highly detailed, and rigorous. It should also be noted that the examiners were all also members of the course development team. Previous studies have had topics with broader scopes, which may have affected the design of the OSCE checklists, the applied content, operations, and number of testing stations, producing relatively lower levels of reliability and validity.

Implementing innovative teaching practices: clinical simulation scenarios and ITII

The scenario building process in this study drew on the work of Bambini, Lioce et al., and Harrington and Simon [25, 30, 39]. The scenarios were based on the writers’ own experience of patient conditions and were supplemented and fleshed out using accumulated academic knowledge to improve their clinical integrity. Bambini suggested adjusting the complexity of the simulation to the learner’s level and building complicated scenarios separately for more experienced and senior workers [39]. Scenario lesson plans for nursing students should consider the students’ academic abilities and the corresponding available data so as to approximate clinical practice, adjusting complexity as needed. As a result, the developed OSCE checklists can achieve high reliability and validity.

A combination of high-fidelity simulation equipment and actors (standardized participants, or “SPs”) was used in the simulation scenario lessons, as per Willhaus [40]. The venues used in the teaching process were equipped with standard ward equipment, patient units, simulated patients, basic medical facilities, tools, nursing carts, and emergency carts. The simulation lesson plans ranged from simple to extremely complex, and the applications of the Apollo Simulator ranged from BLS to septic shock patient care. The simulation equipment varied greatly in function and was chosen based on the teaching goals and the desired outcomes of the scenario simulations, to ensure optimal operation and adherence to key design criteria. As Childs and Sepples noted, the construction and implementation of simulation labs require more time than conventional teaching methods do, and under budget constraints the number of advanced simulators – which are more expensive and complex – could be insufficient [41]. For this reason, the OSCE for septic shock patient care was dropped from this study.

All the scenario materials were pre-built into the Zuvio interactive university classes, and students used their smartphones to watch the materials and answer questions according to the scenario. Because the scenario lessons consume a considerable amount of class time, SPOCs were introduced in a flipped classroom format and integrated into blended learning. Before the scenario lessons, students used their extracurricular time to prepare, which helped compensate for insufficient class time.

ITII covers the instructors’ teaching activities, students’ learning activities, teaching preparation, and classroom management [42]. Although studies have verified the educational advantages of incorporating technology solutions like ITII, nursing education still largely relies on conventional methods. This study suggests that instructors’ own technological literacy is one of the keys to promoting the use of technology in nursing education, which is consistent with the findings of Yeh et al. and Xu and Chen [43, 44]. How educational institutions can cultivate instructors with the relevant knowledge and competencies, and help them keep pace with technological development and adapt to new forms of education, remains a critical issue.

SPOCs, rather than MOOCs, were used in this study because the implementation of the latter carries some challenges (such as a high dropout rate) and because the literature has already reported the advantages of replacing MOOCs with SPOCs [45,46,47,48,49,50,51]. Prior to beginning the course, students were required to take the EWANT online preparatory course, which served as a flipped classroom and added practical courses for the simulated scenario lessons. The findings of this study indicate that SPOCs support effective small-scale blended learning, allowing students to have more comprehensive and in-depth learning experiences while providing instructors with flexible and feasible teaching models. Such models could help instructors understand students’ learning needs and behaviors through learning hours, achievement rates, and formative and summative evaluations.

The effectiveness of introducing innovative teaching

Influence of innovative teaching on OSCEs

The experimental group outperformed the control group in average and individual-subject OSCE scores across the board, particularly in SDH by nearly 15 points, with significant differences in total OSCE scores and in the AMI and SDH subject scores. This suggests that the interventional clinical simulation scenarios and ITII had favorable effects on OSCE formative evaluation scores. No significant difference was found between the two groups for BLS, possibly because this topic involves more basic skills than complex ones. Hu et al. [52] compared flipped learning with conventional instruction for hyperthyroidism knowledge and care skills among medical interns and found no difference in the knowledge component, whereas in clinical case analysis students in flipped learning performed significantly better than those in conventional instruction. Comparing third-year summative evaluations with fourth-year first-semester scores in medical and surgical nursing, moderate and low correlations were exhibited only between the experimental group’s medical and surgical nursing lab and clinical internship scores and their total scores on the revised OSCEs, whereas the control group showed no significant correlations at all. This implies that the innovative instruction did make a difference in total scores on the revised OSCEs, and that introducing innovative teaching methods could lead nursing students to perform better on OSCE formative evaluations than conventional methods would. The revised OSCE was used in this study as the formative evaluation for subjects, not only to audit the students’ performance but also to improve it. This improvement was possibly achieved by providing feedback and adjusting in-progress teaching and learning, as well as by developing intervention measures, thus achieving effective learning [53,54,55].

Whether SA is affected by innovative and conventional teaching

Among the three topics, the experimental and control groups demonstrated significant differences in SA2 and SA3 for AMI; in SA1, SA2, and SA3 for BLS; and in SA1 and SA2 for SDH. Where the differences were significant, the experimental group scored higher than the control group, except for SA1 and SA2 in BLS, where the control group scored higher. This result implies that innovative instruction could help students develop significantly different levels of SA. Generally speaking, new nurses should have high levels of SA1 and SA2 in most clinical scenarios, as they are novice professionals and advanced learners, and will only develop a certain degree of SA3 after achieving professional competence.

However, the experimental group did not develop consistent SA differences across all three topics, which may be due to the different difficulty levels of each topic. Curl et al. [56] likewise found that integrated simulation had different effects on different topics in nursing education. In BLS, which is comparatively simpler than AMI and SDH, the two groups exhibited significant differences at all three SA levels, but the control group demonstrated slightly higher levels of SA1 and SA2 than the experimental group; whether this was caused by a greater familiarity with BLS requires further discussion. As for the other two topics, AMI is moderately difficult, with the two groups demonstrating significant differences in SA2 and SA3, implying that SA1 was basic knowledge for nursing students; SDH is an even more difficult topic, with significant differences between the two groups in SA1 and SA2 only.

Whether innovative teaching and conventional teaching methods affect academic learning effectiveness

No significant differences were found in the third-year total scores between the experimental and control groups, implying that both groups were at around the same level in their medical and surgical nursing summative evaluations, possibly because both had received the same conventional instruction. However, significant differences were observed in the total scores for the first semester of their fourth year, with the experimental group scoring higher by an average of 3.29 points. This suggests that the innovative instruction produced significant improvements in academic grades. In lab grades, the experimental group significantly outperformed the control group by 3.46 points, implying that innovative instruction was superior to conventional instruction for improving short-term lab performance during the one semester of intervention. Consistent with past studies [21, 22], innovative instruction was found to boost student motivation and performance in healthcare education. However, there were no significant differences between the groups’ clinical internship grades, which was attributed to the short duration of the innovative teaching experiment (only one semester, a total of approximately 188 in-class hours). Harrington et al. [57] compared student performance between flipped learning and conventional instruction by testing three times during the semester and found no difference between the groups, demonstrating that such differences are not easily detected within a short span of time, especially when the course content is mostly practical. Greater differences between methods might be produced if clinical simulation scenarios and ITII were utilized for a longer period of time.

Conclusion and limitations

Conclusion

This study developed a simulated instruction system with clinical scenario templates for nursing education based on the ADDIE model, through interdisciplinary collaboration among nursing education, clinical nursing, and medicine, to make the scenarios as close to real life as possible. OSCE checklists based on these simulated clinical scenarios were developed through consensus after many discussions to ensure objectivity and attention to detail. The AMI, BLS, and SDH assessments had Cronbach’s α values between 0.608 and 0.797 and CVIs of 0.981 or above, demonstrating that the OSCE checklists were reliable.

For student performance, there was no significant difference between the experimental and control groups before the course, demonstrating their similarity. After the course, however, performance on the knowledge component in the experimental group (innovative instruction) was better than in the control group (conventional instruction), demonstrating the effectiveness of innovative instruction for boosting knowledge-based learning within a short amount of time. There was no significant difference in the practical component, possibly due to the more complex and varied nature of practical skills, in which differences could not be detected within a short span of time.

In terms of OSCE performance, the experimental group performed better than the control group in overall OSCE and in AMI and SDH, demonstrating the effectiveness of innovative instruction in boosting clinical care skills. However, since BLS content is more knowledge-based, there was no significant difference in OSCE scores between the two groups.

On the other hand, across the different levels of situational awareness, student performance was related to task difficulty. In the highly difficult SDH tasks, there was a difference between the experimental and control groups on the OSCE for SA1 and SA2: the experimental group performed better at the perception and comprehension levels, but there was no significant difference at the projection level. In the moderately difficult AMI tasks, there was a significant difference between the groups in SA2 and SA3, showing that students receiving innovative instruction performed better in comprehension and projection than those receiving conventional instruction, but there was no difference in perception between the groups. For the easier BLS tasks, the experimental group performed better in SA3 but worse in SA1 and SA2 compared to the control group. This might suggest that conventional instruction was helpful for perception and comprehension in BLS tasks, while the projection level could be improved with innovative instruction.

Overall, innovative instruction could help boost performance in knowledge-based content and on the OSCE, and the different teaching methods affect different levels of situational awareness in practical tasks. This study could serve as a reference for future nursing education research, as well as a set of recommendations for instructors of clinical nursing or medicine.

Limitations and future research

The participants in this study came from purposive sampling in a five-year junior college program, which is not the main nursing degree program in Taiwan. Future participants could be recruited from, and comparisons made with, other types of nursing degree programs, such as two-year technical programs and bachelor’s degree programs, for broader representation and better objectivity. Clinical expertise in this study was provided entirely by medical specialists, clinical supervisors, head nurses, and nurse practitioners at a single institution (St. Martin De Porres Hospital). Future collaborations with other institutions at different levels of care could be explored to expand and optimize the design of the OSCE checklists. The innovative teaching methods were delivered in a hybrid manner, making exploration of any single intervention unfeasible; future studies could evaluate the effects of individual elements of the teaching methods.

Due to constraints in resources and time, the innovative teaching methods were applied for only a single semester, and only modest modifications were made to existing OSCE checklists. Extensive re-development of the entire medical-surgical nursing curriculum is recommended. With the proliferation of information technology-based applications in teaching methods for nursing education, greater proficiency in recent technologies, as well as interdisciplinary communication, is needed in the continuing education of instructors to optimize nursing education.

Availability of data and materials

The data presented in this study are available on request from the corresponding author. The data are not publicly available due to legal restrictions imposed by the government of Taiwan in relation to the “Personal Information Protection Act”.

References

  1. Harden RM, Stevenson M, Downie WW, Wilson GM. Assessment of clinical competence using objective structured examination. Br Med J. 1975;1(5955):447–51. https://doi.org/10.1136/bmj.1.5955.447.

  2. Rushforth HE. Objective structured clinical examination (OSCE): review of literature and implications for nursing education. Nurse Educ Today. 2007;27(5):481–90. https://doi.org/10.1016/j.nedt.2006.08.009.

  3. Schuwirth LWT, van der Vleuten CPM. The use of clinical simulations in assessment. Med Educ. 2003;37(Suppl 1):65–71. https://doi.org/10.1046/j.1365-2923.37.s1.8.

  4. Bartfay WJ, Rombough R, Howse E, LeBlanc R. The OSCE approach in nursing education: objective structured clinical examinations can be effective vehicles for nursing education and practice by promoting the mastery of clinical skills and decision-making in controlled and safe learning environments. Can Nurse. 2004;100(3):18.

  5. Watson R, Stimpson A, Topping A, Porock D. Clinical competence assessment in nursing: a systematic review of the literature. J Adv Nurs. 2002;39(5):421–31. https://doi.org/10.1046/j.1365-2648.2002.02307.

  6. Miller GE. The assessment of clinical skills/competence/performance. Acad Med. 1990;65(9 Suppl):S63–7. https://doi.org/10.1097/00001888-199009000-00045.

  7. Downing SM, Haladyna TM. Validity threats: overcoming interference with proposed interpretations of assessment data. Med Educ. 2004;38(3):327–33. https://doi.org/10.1046/j.1365-2923.2004.01777.

  8. Stillman PL, Regan MB, Philbin M, Haley HL. Results of a survey on the use of standardized patients to teach and evaluate clinical skills. Acad Med. 1990;65(5):288–92. https://doi.org/10.1097/00001888-199005000-00002.

  9. Endsley M. Measurement of situation awareness in dynamic systems. Hum Factors. 1995;37(1):65–84. https://doi.org/10.1518/001872095779049499.

  10. Flin R, O’Connor P, Crichton M. Safety at the sharp end: a guide to non-technical skills. CRC Press; 2017. https://doi.org/10.1201/9781315607467.

  11. Endsley MR, Jones DG. Designing for situation awareness: an approach to human-centered design. 2nd ed. Taylor & Francis; 2012.

  12. Wickens C. Multiple resources and mental workload. Hum Factors. 2008;50(3):449–55. https://doi.org/10.1518/001872008X288394.

  13. Endsley M. Theoretical underpinnings of situation awareness: a critical review. In: Endsley MR, Garland DJ, editors. Situation awareness analysis and measurement. CRC Press; 2000. p. 3–32. https://doi.org/10.1201/b12461.

  14. Endsley MR. Situation awareness: operationally necessary and scientifically grounded. Cogn Technol Work. 2015;17(2):163–7. https://doi.org/10.1007/s10111-015-0323-5.

  15. Baykul Y. Eğitim sisteminde değerlendirme [Evaluation in the education system]. Hacettepe Üniversitesi Eğitim Fakültesi Derg. 1992;7(7).

  16. Boud D, Soler R. Sustainable assessment revisited. Assess Eval High Educ. 2016;41(3):400–13. https://doi.org/10.1080/02602938.2015.1018133.

  17. Bruff DO, Fisher DH, McEwen KE, Smith BE. Wrapping a MOOC: student perceptions of an experiment in blended learning. J Online Learn Teach. 2013;9(2):187.

  18. García-Peñalvo F, Blanco Á, Sein-Echaluce M. An adaptive hybrid MOOC model: disrupting the MOOC concept in higher education. Telemat Informatics. 2017;35(4):1018–30. https://doi.org/10.1016/j.tele.2017.09.012.

  19. Whitehill J, Mohan K, Seaton D, Rosen Y, Tingley D. Delving deeper into MOOC student dropout prediction. Published online 2017.

  20. Escudero-Nahón A. Análisis crítico al término “masivo” en los MOOC: una Cartografía Conceptual [A critical analysis of the term “massive” in MOOCs: a conceptual cartography]. Edmetic. 2020;9(1):188–212. https://doi.org/10.21071/edmetic.v9i1.12252.

  21. Wang K, Zhu C. MOOC-based flipped learning in higher education: students’ participation, experience and learning performance. Int J Educ Technol High Educ. 2019;16(1):1–18. https://doi.org/10.1186/s41239-019-0163-0.

  22. Li J, Zhang X, Hu Z. The design and application of flip classroom teaching based on computer technology. Int J Educ Technol High Educ. 2018;13(10):95–107. https://doi.org/10.3991/ijet.v13i10.9453.

  23. Thomas PA, Kern DE, Hughes MT, Chen BY. Curriculum development for medical education: a six-step approach. JHU Press; 2016.

  24. Benishek LE, Lazzara EH, Gaught WL, Arcaro LL, Okuda Y, Salas E. The template of events for applied and critical healthcare simulation (TEACH Sim): a tool for systematic simulation scenario design. Simul Healthc. 2015;10(1):21–30. https://doi.org/10.1097/SIH.0000000000000058.

  25. Harrington DW, Simon LV. Designing a simulation scenario. StatPearls; 2020. https://www.statpearls.com/ArticleLibrary/viewarticle/63807.

  26. Allen WC. Overview and evolution of the ADDIE training system. Adv Dev Hum Resour. 2006;8(4):430–41. https://doi.org/10.1177/1523422306292942.

  27. Reiser RA, Dempsey JV, editors. Trends and issues in instructional design and technology. 1st ed. Upper Saddle River, NJ; 2002.

  28. Kirkpatrick D, Kirkpatrick J. Evaluating training programs: the four levels. Berrett-Koehler Publishers; 2006.

  29. Waltz CF, Strickland OL, Lenz ER. Measurement in nursing research. 2nd ed. F. A. Davis Company; 1991.

  30. Lioce L, Meakim C, Fey M, Victor J, Mariani B, Alinier G. Standards of best practice: simulation standard IX: simulation design. Clin Simul Nurs. 2015;11(6):309–15. https://doi.org/10.1016/j.ecns.2015.03.005.

  31. McWilliam P, Botwinski C. Developing a successful nursing objective structured clinical examination. J Nurs Educ. 2010;49(1):36–41. https://doi.org/10.3928/01484834-20090915-01.

  32. Harden RM, Gleeson FA. Assessment of clinical competence using an objective structured clinical examination (OSCE). Med Educ. 1979;13(1):39–54. https://doi.org/10.1111/j1365-2923.1979.tb00918.

  33. Wessel J, Williams R, Finch E, Gémus M. Reliability and validity of an objective structured clinical examination for physical therapy students. J Allied Health. 2003;32(4):266–9.

  34. Fox R, Dacre J, Mclure C. The impact of formal instruction in clinical examination skills on medical student performance – the example of peripheral nervous system examination. Med Educ. 2001;35(4):371–3. https://doi.org/10.1046/j.1365-2923.2001.00732.

  35. Martin IG, Stark P, Jolly B. Benefiting from clinical experience: the influence of learning style and clinical experience on performance in an undergraduate objective structured clinical examination. Med Educ. 2000;34(7):530–4. https://doi.org/10.1046/j.1365-2923.2000.00489.

  36. Wilkinson T, Newble D, Wilson P, Carter J, Helms R. Development of a three-centre simultaneous objective structured clinical examination. Med Educ. 2000;34(10):798–807. https://doi.org/10.1046/j.1365-2923.2000.00669.x.

  37. Wass V, Jolly B. Does observation add to the validity of the long case? Med Educ. 2001;35(8):729–34. https://doi.org/10.1046/j.1365-2923.2001.01012.

  38. Regehr G, Freeman R, Hodges B, Russell L. Assessing the generalizability of OSCE measures across content domains. Acad Med. 1999;74(12):1320–2. https://doi.org/10.1097/00001888-199912000-00015.

  39. Bambini D. Writing a simulation scenario: a step-by-step guide. AACN Adv Crit Care. 2016;27(1):62–70. https://doi.org/10.4037/aacnacc2016986.

  40. Willhaus J. Simulation basics: how to conduct a high-fidelity simulation. AACN Adv Crit Care. 2016;27(1):71–7. https://doi.org/10.4037/aacnacc2016569.

  41. Childs JC, Sepples S. Clinical teaching by simulation: lessons learned from a complex patient care scenario. Nurs Educ Perspect. 2006;27(3):154–8.

  42. López de la Serna A. Integración de los MOOC en la enseñanza universitaria. El caso de los SPOC [Integration of MOOCs in university teaching: the case of SPOCs]. Published online 2016.

  43. Yeh C-C, Chang D-F, Chang L-Y. Information technology integrated into classroom teaching and its effects. Online Submission. Published online 2011.

  44. Xu A, Chen G. A study on the effects of teachers’ information literacy on information technology integrated instruction and teaching effectiveness. Eurasia J Math Sci Technol Educ. 2016;12(2):335–46.

  45. Ebben M, Murphy JS. Unpacking MOOC scholarly discourse: a review of nascent MOOC scholarship. Learn Media Technol. 2014;39(3):328–45. https://doi.org/10.1080/17439884.2013.878352.

  46. Hew KF, Cheung WS. Students’ and instructors’ use of massive open online courses (MOOCs): motivations and challenges. Educ Res Rev. 2014;12:45–58. https://doi.org/10.1016/j.edurev.2014.05.001.

  47. Jacoby J. The disruptive potential of the massive open online course: a literature review. J Open Flexible Distance Learn. 2014;18(1):73–85.

  48. Kennedy J. Characteristics of massive open online courses (MOOCs): a research review, 2009–2012. J Interact Online Learn. 2014;13(1).

  49. Mehrara BJ, Greene AK. Lymphedema and obesity: is there a link? Plast Reconstr Surg. 2014;134(1):154e–60e. https://doi.org/10.1097/PRS.0000000000000268.

  50. Liyanagunawardena TR, Adams AA, Williams SA. MOOCs: a systematic study of the published literature 2008–2012. Int Rev Res Open Distrib Learn. 2013;14(3):202–27. https://doi.org/10.19173/irrodl.v14i3.1455.

  51. Koller D, Ng A, Do C, Chen Z. Retention and intention in massive open online courses: in depth. Educ Rev. 2013;48(3):62–3.

  52. Hu X, Zhang H, Song Y, Wu C, Yang Q, Shi Z, et al. Implementation of flipped classroom combined with problem-based learning: an approach to promote learning about hyperthyroidism in the endocrinology internship. BMC Med Educ. 2019;19(1):1–8. https://doi.org/10.1186/s12909-019-1714-8.

  53. McManus S. Attributes of effective formative assessment. Published online 2008.

  54. Shepard LA. Classroom assessment. In: Schmeiser CB, Welch CJ, Brennan RL, editors. Educational measurement. Westport, CT: American Council on Education and Praeger Publishers; 2006. p. 623–46.

  55. Trumbull E, Lash A. Understanding formative assessment: insights from learning theory and measurement theory. San Francisco: WestEd; 2013. p. 1–20.

  56. Curl ED, Smith S, Chisholm LA, McGee LA, Das K. Effectiveness of integrated simulation and clinical experiences compared to traditional clinical experiences for nursing students. Nurs Educ Perspect. 2016;37(2):72–7.

  57. Harrington SA, Bosch MV, Schoofs N, Beel-Bates C, Anderson K. Quantitative outcomes for nursing students in a flipped classroom. Nurs Educ Perspect. 2015;36(3):179–81. https://doi.org/10.5480/13-1255.

Acknowledgments

The authors are very grateful to St. Martin De Porres Hospital for its funding support and for the team of experts who drafted the clinical scenarios and designed the OSCE checklists; to Chung-Jen Junior College of Nursing, Health Sciences and Management for its support with curricula, classroom space, and equipment; and to all the participants for their assistance and cooperation. We would also like to thank the anonymous reviewers and the editor for their comments.

Conflict of interest

The authors declare no conflict of interest.

Funding

This study was funded by the St. Martin De Porres Hospital (HR54-P1903) and the Allied Advanced Intelligent Biomedical Research Center (A21BRC) under the Higher Education Sprout Project of Ministry of Education.

Author information

Authors and Affiliations

Authors

Contributions

Conceptualization, YKO and LPT; Data curation, LPT; Formal analysis, LPT and LPH; Methodology, YKO and THH; Resources, LPT and LPH; Writing – original draft, YKO and LPT; Writing – review & editing, YKO, LPT and THH. All the authors read and approved the final manuscript.

Corresponding author

Correspondence to Yang-Kun Ou.

Ethics declarations

Ethics approval and consent to participate

The study was conducted according to the guidelines of the Declaration of Helsinki and approved by the Institutional Review Board of St. Martin De Porres Hospital (IRB No.18B-012 and Date of Approval: 2019/11/1). Informed consent was obtained from all subjects involved in the study.

Consent for publication

Not Applicable.

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/. The Creative Commons Public Domain Dedication waiver (http://creativecommons.org/publicdomain/zero/1.0/) applies to the data made available in this article, unless otherwise stated in a credit line to the data.

About this article

Cite this article

Tseng, LP., Hou, TH., Huang, LP. et al. Effectiveness of applying clinical simulation scenarios and integrating information technology in medical-surgical nursing and critical nursing courses. BMC Nurs 20, 229 (2021). https://doi.org/10.1186/s12912-021-00744-7
