- Research article
- Open Access
The objective structured clinical examination as an assessment strategy for clinical competence in novice nursing practitioners in Taiwan
BMC Nursing volume 20, Article number: 91 (2021)
Conventional written tests and professional assessments have limitations in making fair judgements of clinical competence, because examiners may lack full objectivity and standardization throughout the assessment process. We sought to design a valid method of competence assessment in medical and nursing specialties. This work aimed to develop an Objective Structured Clinical Examination (OSCE) to evaluate novice nursing practitioners’ clinical competency, work stress, professional confidence, and career satisfaction.
This was a quasi-experimental (pre-post) study. Fifty-five novice nursing practitioners took the OSCE three months after their graduation. The OSCE consisted of four stations assessing history taking, physical examination, problem-directed management, interpersonal communication, and the techniques required for related procedures. The examiners had to complete an assessment checklist, and the participants had to complete a pre-post questionnaire (modified from a Nursing Competency Questionnaire, a Stress scale, and a Satisfaction with Learning scale).
Among the novice nursing practitioners, 41 (74.5 %) passed the exam, with a mean score of 61.38 ± 8.34. The passing rate was significantly higher among nurses working in medical-surgical wards (85.7 %) and the intensive care unit-emergency department (77.8 %) than among novice nursing practitioners working in other units. All the novice nursing practitioners performed poorly at Station A, assessing patients with a fever. OSCE performance was more strongly associated with educational attainment and work unit than with gender. Finally, after the OSCE the participants showed statistically significant increases in clinical competency, confidence in their professional competence, and satisfaction with clinical practice, together with decreased work stress.
We found that the OSCE process had a positive educational effect, providing a meaningful and accurate assessment of the competence of novice nursing practitioners. An appropriate OSCE program is vital for novice nursing practitioners, educators, and administrators. The effective application of OSCEs can help novice nursing practitioners gain confidence in their clinical skills.
Nursing competency is considered an integrative ability encompassing the clinical knowledge, judgment, skills, attitudes, and beliefs needed to perform in specific practice settings and in different situations [1, 2]. Competence also reflects holistic nursing care. Poor nursing competence decreases the quality of care and patient safety [3, 4]. Novice nursing practitioners (NNPs) usually have difficulty practicing competently, which can compromise the quality of their patient care [5, 6]. The transition period to achieving competency can be a time of strain for NNPs. Thus, the turnover rate of nurses tends to be high at the start of their careers. Additionally, patients are increasingly interested in customized treatment, and inadequate nursing competency may affect the implementation of multidisciplinary team plans. Such issues may be resolved by offering NNPs an appropriate program of competency assessment and in-service education that may improve their competence and help them adapt to their work environment.
Conventional methods of assessing clinical competence remain a matter of concern because examiners may not maintain full objectivity and standardization throughout the assessment process. Furthermore, it is risky for healthcare institutions to have NNPs demonstrate their ability with real patients and clinical situations during their training period, even though this type of assessment may yield reliable results that reflect the true competence of NNPs. Therefore, appropriate assessment of competence remains an ongoing challenge for the institutions responsible for training NNPs. The Objective Structured Clinical Examination (OSCE), a multidimensional practical examination of clinical performance, can reflect the problem-solving abilities, critical thinking, and communication skills of healthcare professionals [8, 9] and has been reported to be a feasible method of assessing competence in undergraduate and postgraduate medical education, paramedical-specialist training, and licensing examinations [8, 10, 11]. It provides a meaningful alternative strategy because it allows individual assessment of an entire group in a timely, controlled, and safe way. Recently, the application of the OSCE to assess clinical skill competency has gained attention in nursing education. The content of OSCE stations varies according to student experience and the nature of the assessment; the types of problems portrayed are those that students would commonly encounter in a clinic or hospital. Throughout the OSCE, examinees demonstrate their clinical competence in a safe clinical scenario, and educators can audit examinees’ weak or missing competencies [13, 14]. Hence, the OSCE allows examinees to demonstrate their nursing competence in a simulated clinical setting that reflects the clinical competence nurses need to care for patients [15,16,17,18].
Although using the OSCE to assess the clinical competence of NNPs might not fully reflect how nurses will perform in the clinical setting, it remains an important strategy: it falls just short of optimal practice-based assessment yet above written assignments or multiple-choice questions. Based on previous research, the aim of this study was to investigate the impact of an OSCE program on the learning progression of NNPs.
OSCE setting and participants
We conducted the examination three months after the NNPs started their careers at our institution. We tested their clinical skills related to the issues addressed at each station: history taking, conducting a complete physical examination, problem-directed management, interpersonal communication, and required-procedure techniques (Table 1). We created a 4-station OSCE: (1) Care of fever (Station A), (2) medication administration (Station B), (3) patients with abdominal pain (Station C), and (4) care for intravenous lines (Station D) (Fig. 1).
We chose these four clinical scenarios because nursing staff with over 10 years of clinical experience at our institution reviewed the relevant literature [3, 19,20,21,22] and advised that proper handling of these clinical problems is essential for NNPs at the beginning of their careers. Furthermore, over 90 % of the experts on the OSCE education committee of our institution agreed on the importance and practicality of each clinical scenario. The internal consistency of the OSCE stations was tested using Cronbach’s alpha; the overall Cronbach’s α coefficient was 0.791, indicating good stability and internal consistency, with only minor differences across the indices.
This comprehensive four-station OSCE was carried out at Chang Gung Memorial Hospital, Keelung, Taiwan, between August 2017 and July 2018. The study population consisted of fifty-five NNPs from different work units who had obtained a diploma or bachelor’s degree in nursing in Taiwan but had no internship experience in clinical practice as of July 2017. None of the NNPs had taken part in an OSCE before this study. All 55 participants completed the required test items and questionnaires by the end of the study. Before entering the OSCE, NNPs had to complete training in core professional skills: a 5-day orientation comprising standard training courses designed and verified by the Department of Nursing at our institution. Three months after the orientation courses, the NNPs were assessed at the end of the module through a formative OSCE. The NNPs were familiarized with the OSCE procedure under the guidance of instructors, who were nursing staff at our institution with comprehensive training. The instructors encouraged the NNPs to discuss the core elements and gave each NNP feedback on their achievements, deficiencies, and opportunities for improvement.
Implementation, instruments, and evaluations at the OSCE stations
Each station had one standardized patient (SP) and one examiner. The SP was a person who had completed at least eight hours of standard training provided by the Taiwan Association of Standardized Patients and who was capable of simulating the signs and symptoms of diseases, mimicking clinical scenarios, and providing feedback to the NNPs. The examiner was a nursing faculty rater who had completed an OSCE education training program and was certified by our institution and the Taiwan Nursing Association. The raters acted as passive evaluators and were instructed not to guide or prompt the participants.
At the beginning of the test at each OSCE station, participants had 1 min to read a written description of the required tasks. Participants spent 10 min at each station: 8 min of observation and 2 min of immediate verbal feedback from the station examiner (Fig. 1). The examiner assessed the participants’ clinical skills, strategies, and interpretation of clinical problems (Table 1) and graded them according to a checklist for each skill. The checklist consisted of 10–12 items rated on a 3-point scale: 0 (failed to perform), 1 (performed poorly or out of sequence), and 2 (performed appropriately in the correct sequence). Kendall’s coefficient of concordance was 0.781 (p < 0.0001), indicating a significant correlation between the examiners’ scores and, consequently, good inter-rater agreement. We also measured certain practices, such as greeting the patient and hand decontamination, but did not count these elements toward participants’ overall scores. We recorded the sum of the scores from all the checklist items for each station, and the participants received their own performance-analysis report after the OSCE (Fig. 2). The instructors arranged an 80-min debriefing session to review the report and help the NNPs understand the core (i.e., clinically important) elements of the stations. We used the “borderline-group method” to establish the standard “pass” score: the “pass” score was the mean score of the NNPs whose OSCE performance was rated “borderline” at each station.
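The borderline-group standard-setting procedure described above can be sketched as follows. This is a minimal illustration, assuming each examiner assigns a global pass/borderline/fail rating alongside the checklist total; the function name and all data are hypothetical, not the study’s actual records.

```python
# Sketch of the borderline-group method: the station cut score is the
# mean checklist score of examinees whose global rating is "borderline".
from statistics import mean

def borderline_cut_score(results):
    """results: list of (checklist_score, global_rating) for one station."""
    borderline = [score for score, rating in results if rating == "borderline"]
    if not borderline:
        raise ValueError("no borderline ratings; cut score undefined")
    return mean(borderline)

# Hypothetical Station A data: six examinees.
station_a = [(18, "pass"), (12, "borderline"), (9, "fail"),
             (13, "borderline"), (16, "pass"), (11, "borderline")]
print(borderline_cut_score(station_a))  # mean of 12, 13, 11
```

An examinee then passes the station if their checklist total meets or exceeds this cut score.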
Participants were required to complete a questionnaire before the OSCE program (pretest) and after its completion (posttest). The questionnaire was a modified version of a tool used in a previous report; it collected basic learning and personal background information and included a Nursing Competency Questionnaire (NCQ), a Stress scale, and a Satisfaction with Learning scale. The NCQ, a 26-item instrument using a five-point scale, was designed to evaluate nursing competency across five domains: taking a medical history (5 questions), physical assessment (3 questions), interpersonal communication (7 questions), problem-directed management (5 questions), and problem-required skills (6 questions). The Stress scale consisted of 10 statements relating to stressful nursing situations, each rated on a 5-point scale (1 = not stressful at all to 5 = extremely stressful). The Satisfaction with Learning scale was a 3-item instrument designed to measure the nurses’ satisfaction; each item contained a statement about satisfaction with learning with regard to obtaining input from trainers, rated on a 5-point scale from 1 (strongly disagree) to 5 (strongly agree). Seven experts, including three attending physicians and four senior nursing supervisors, were invited to validate this questionnaire. A test of internal reliability was conducted with ten senior nurses who had more than 5 years of working experience. The experts then rated its content validity, which yielded a content validity index (CVI) of 0.89–0.91.
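A content validity index of the kind reported above is conventionally computed from expert relevance ratings: the item-level CVI (I-CVI) is the proportion of experts rating the item relevant, and the scale average (S-CVI/Ave) is the mean of the I-CVIs. The sketch below assumes a 4-point relevance scale with ratings of 3 or 4 counting as relevant; all ratings shown are fabricated for illustration.

```python
# Item-level content validity index (I-CVI) and scale average (S-CVI/Ave).
def item_cvi(ratings):
    """Proportion of experts rating the item 3 or 4 on a 1-4 relevance scale."""
    return sum(r >= 3 for r in ratings) / len(ratings)

# Seven hypothetical experts rating three hypothetical items.
ratings_by_item = [
    [4, 4, 3, 4, 3, 4, 4],  # all seven rate it relevant
    [4, 3, 3, 2, 4, 4, 3],  # six of seven
    [3, 4, 4, 4, 2, 3, 4],  # six of seven
]
i_cvis = [item_cvi(r) for r in ratings_by_item]
s_cvi_ave = sum(i_cvis) / len(i_cvis)
print([round(v, 2) for v in i_cvis], round(s_cvi_ave, 2))  # → [1.0, 0.86, 0.86] 0.9
```

With seven experts, an I-CVI of 6/7 ≈ 0.86 or higher is typically taken as acceptable, which is consistent with the 0.89–0.91 range reported.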
Data collection began after the research ethics committee at the host hospital had approved the study protocol (IRB approval number: 104-9928B). Subsequently, we held a meeting with the NNPs to explain the program and the study, including the study’s purpose and procedures, the participants’ rights, and confidentiality. We sent this information, including a covering letter and the questionnaire, to the participants before data collection in a self-addressed stamped envelope. For their convenience, the participants could complete the questionnaires in either paper or electronic form and return them by mail or email to the research team. We destroyed all the envelopes and deleted all the email addresses that could identify the participants immediately after the data were saved on a secure computer protected with passwords known only to the primary investigator.
The data were verified and analyzed using the Statistical Package for the Social Sciences (SPSS) software, version 21.0 for Windows. Descriptive statistics (means and standard deviations) were obtained for each examination tool and analyzed by one-sample or two-sample t-tests, or by analysis of variance where appropriate. Frequencies and percentages were used to present demographic data; the chi-square test and Spearman’s correlation were used to test the significance of associations between demographic variables and competency levels. Continuous data were tested for normality using the Kolmogorov-Smirnov test and presented as means and standard deviations. The internal consistency of the OSCE stations was tested using Cronbach’s alpha. Agreement between the total scores obtained in the two tests was analyzed using Bland-Altman analysis, and associations were measured using Pearson’s correlation coefficient. The level of significance for all analyses was set at 5 % (p < 0.05).
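The internal-consistency statistic used above, Cronbach’s alpha, can be computed directly from a matrix of per-station scores as k/(k−1) · (1 − Σ item variances / total-score variance). The sketch below is illustrative only; the score matrix is fabricated, not the study’s data.

```python
# Cronbach's alpha over examinee-by-station checklist totals.
from statistics import variance

def cronbach_alpha(scores):
    """scores: list of per-examinee lists of station scores (sample variances)."""
    k = len(scores[0])                                    # number of stations
    item_vars = sum(variance(col) for col in zip(*scores))  # per-station variances
    total_var = variance([sum(row) for row in scores])      # variance of totals
    return k / (k - 1) * (1 - item_vars / total_var)

# Hypothetical checklist totals for five examinees at four stations.
scores = [
    [14, 16, 13, 15],
    [10, 12, 9, 11],
    [18, 19, 17, 18],
    [12, 13, 11, 12],
    [16, 17, 15, 16],
]
print(round(cronbach_alpha(scores), 3))
```

Highly correlated station scores, as in this toy matrix, push alpha toward 1; the study’s observed value of 0.791 indicates good but not redundant agreement across the four stations.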
Demographic features of the participants
The characteristics of the participants are shown in Table 2. The examinees came from four different units and were analyzed by several variables, including gender, educational level, and previous OSCE experience. Of the 55 participants, 50 were female and 5 were male; ages ranged from 20 to 29 years. The majority of the examinees had graduated from college (74.5 %, n = 41), and about half worked in medical or surgical wards (50.9 %, n = 28).
NNPs’ evaluation of the OSCE
The results were analyzed using the Modified Angoff Method. The passing score of each station and the passing criteria of the OSCE are shown in Fig. 2. We found that 42 NNPs (76.4 %) passed the OSCE with a mean score of 64.62 ± 5.79 (range, 56–79), whereas 13 participants (23.6 %) failed the competency test with a mean score of 48.54 ± 6.33 (range, 37–57). The participants who worked in “Other” units (55.6 %) had a significantly higher failure rate than participants who worked in medical-surgical wards (17.9 %) and the intensive care unit-emergency department (16.7 %; p < 0.05). Table 3 shows that OSCE performance was associated with educational attainment, gender, and work unit. Regarding gender, the male participants (67.40 ± 8.26) performed better overall than the female participants (60.30 ± 8.93; p = 0.009). Overall, the performance of participants with a college education or above (62.41 ± 8.44) was better than that of junior-college graduates (56.64 ± 9.67; p = 0.038). Regarding unit differences at Station C, the best-performing units were the critical care and emergency units, followed by the medical-surgical and other units (16.33 ± 3.48 vs. 15.32 ± 3.61 and 12.44 ± 2.79; p = 0.028). Overall, the average score at the medication administration station (Station B) was higher than at the other three stations. Participants had relatively low pass rates at Stations A and C, particularly for the patient with a fever at Station A, where the NNPs had the lowest average score (10.42 ± 3.00) and the lowest pass rate (10.9 %) (Table 3).
Clinical competence of novice nursing practitioners
As shown in Table 4, the participants had statistically significant increases in their clinical competency, confidence in their professional competence, and satisfaction with clinical practice, and a significant decrease in work stress after taking the OSCE. We found that the 14 NNPs who failed the test nonetheless gained greater confidence in their competence (10.40 ± 1.53 vs. 12.54 ± 1.92, p = 0.044) and greater satisfaction (9.75 ± 0.50 vs. 12.29 ± 1.54, p = 0.042) after taking the OSCE. No significant differences in confidence, competence, work stress, or satisfaction were found among nurses working in different units (data not shown). Nevertheless, the NNPs working in other units had significantly (p = 0.032) lower clinical competence scores (58.32 ± 6.32) than NNPs working in the medical-surgical wards (62.75 ± 3.53) and the intensive care unit-emergency department (61.36 ± 2.89).
The current study shows that the OSCE offers a practical strategy for nursing educators and healthcare administrators to improve the clinical ability of NNPs. Through multiple feedback sessions and debriefing, teaching faculty can understand what types of clinical abilities NNPs need to improve in their practice. During the OSCE in our study, the faculty obtained immediate feedback from the NNPs and could assess the suitability of the current clinical teaching program for the NNPs’ learning. Although preparation for an OSCE is time-consuming and requires careful planning, this teaching program offers valuable assistance in evaluating clinical performance. We found that an OSCE program for NNPs improved clinical competency and reduced work-related stress. In addition, the OSCE helped increase the NNPs’ confidence and reduce their personal embarrassment when encountering similar clinical situations. Previous reports showed that nursing students found the OSCE helped them deal with stressful clinical situations and develop their confidence in clinical practice [24,25,26,27]. Student midwives saw the OSCE as a valid means of assessment that increased their confidence in performing clinical skills [26, 28]. These findings support our conclusion that an OSCE program is an appropriate method for accurately measuring and effectively addressing weaknesses in order to improve the competence of NNPs in daily practice.
Some interesting observations and study concerns merit further discussion. First, previous research suggested a positive correlation between the number of stations and reliability [10, 29]. Four stations were included in this OSCE based upon the experience of clinical experts at our institution. The overall reliability in the present study was 0.791, a desirable level for high-stakes tests such as certification [30,31,32]. The number of stations in this OSCE was thus appropriate to assess the competence of the NNPs in our study. Second, the lowest OSCE scores and the fewest station passes occurred at Station A (care of fever). In addition, participants from other units (e.g., the baby room, delivery room, and operating rooms) performed poorly in the overall tests and at Station C (care of abdominal pain). Through the debriefing session, we found that these NNPs were unable to perform well in history taking and in symptom assessment related to the patient’s problems. These results may be attributed to the fact that they had few opportunities to encounter patients with these clinical scenarios during the first three months of their careers. To improve the clinical ability of the NNPs who failed, we arranged interactive teaching sessions in their work units. Senior instructors with over 5 years of clinical experience provided opportunities for NNPs to work with patients with these clinical problems and guided them in interacting with the patients, including questioning patients and reflecting on what they had learned. The instructors also focused on teaching NNPs how to recognize signs and symptoms, understand daily medication regimens, interpret abnormal laboratory data, and facilitate communication between nurses and patients.
Likewise, since all five male NNPs in this study came from either the medical-surgical wards or the intensive care unit-emergency department, they had a better chance of encountering the medical events presented at the OSCE stations and consequently achieved better scores. Lastly, Saito et al. reported that adopting the OSCE in medical education is effective for training medical students in the necessary basic skills, both technical and behavioral, and enables educators to guide students toward the appropriate integration of knowledge, skills, and behavior. Even though some NNPs failed the OSCE, they still benefited from it in terms of confidence in their clinical competence and satisfaction with OSCE learning. The feedback session following the assessment of students’ performance is a vital element in their learning process; we believe feedback is most effective at improving competence when given immediately after the examination. The OSCE itself provided NNPs with a bi-directional feedback mechanism for gauging their strengths and weaknesses in clinical skills.
The major limitation of this study is that it did not compare the OSCE with conventional methods of assessing the competence of NNPs. However, in the current study the OSCE substituted for direct observation of actual patients and offered a sound assessment of competence and improvement among NNPs. Moreover, the OSCE offers an objective and standardized tool to assess the multifaceted clinical ability of NNPs in a close-to-clinical situation, and it is an appropriate method for accurately measuring and effectively addressing weaknesses in order to improve the competence of NNPs in daily practice.
Taken together, our results further support the notion that OSCE training, with efficient interactive communication, is mutually beneficial to the NNPs and the training staff involved in the learning process. This educational approach requires robust design based on sound pedagogy to assure practice and assessment of holistic nursing care.
A well-designed OSCE has a positive educational impact, offers an appropriate professional assessment, and helps NNPs gain confidence and improve their clinical competence. We believe that the OSCE is an effective and authentic mode of assessment that can be applied to other levels of nurses as well.
Availability of data and materials
The datasets generated and/or analyzed during the current study are available from the corresponding author on reasonable request.
ICU: Intensive Care Unit
NNP: Novice nursing practitioner
NCQ: Nursing Competency Questionnaire
OSCE: Objective Structured Clinical Examination
Brown RA, Crookes PA. What are the ‘necessary’ skills for a newly graduating RN? Results of an Australian survey. BMC Nursing. 2016;15:1–8.
Chen Y, Roger W. A review of clinical competence assessment in nursing. Nurse Educ Today. 2011;31:832–6.
Chen SH, Chen SC, Lee S, Chang YL, Yeh KY. Impact of interactive situated and simulated teaching program on novice nursing practitioners’ clinical competence, confidence, and stress. Nurse Educ Today. 2017;55:11–6.
Kieft RA, de Brouwer BB, Francke AL, Delnoij DM. How nurses and their work environment affect patient experiences of the quality of care: a qualitative study. BMC Health Services Research. 2014;14:249–59.
Duchscher JEB. Transition shock: the initial stage of role adaptation for newly graduated Registered Nurses. J Adv Nurs. 2009;65:1103–13.
Lin PS, Viscardi MK, McHugh MD. Factors influencing job satisfaction of new graduate nurses participating in nurse residency programs: a systematic review. The Journal of Continuing Education in Nursing. 2014;45:439–50.
Missen K, McKenna L, Beauchamp A. Graduate nurse program coordinators’ perceptions of role adaptation experienced by new nursing graduates: a descriptive qualitative approach. Nurse Educ Practice. 2014;4:134–42.
Harden RM. Revisiting ‘Assessment of clinical competence using an objective structured clinical examination (OSCE)’. Med Educ. 2016;50:376–9.
Newble D. Techniques for measuring clinical competence: objective structured clinical examinations. Med Educ. 2004;38:199–203.
Hodges BD. The objective structured clinical examination: Three decades of development. Journal of Veterinary Med Educ. 2006;33:571–7.
Oranye NO, Ahmed C, Ahmed N, Abu Bakar R. Assessing Nursing Clinical Skills Competence through Objective Structured Clinical Examination (OSCE) for Open Distance Learning Students in Open University Malaysia. Contemporary Nurse. 2012;41:233–41.
Rushforth H. Objective structured clinical examination (OSCE): review of literature and implications for nursing education. Nurse Educ Today. 2007;27:481–90.
Schuwirth LW, van der Vleuten CP. The use of clinical simulations in assessment. Med Educ. 2003;37:65–71.
Hsu CM, Hsiao CT, Chang LC, Chang HY. Is there an association between nurse, clinical teacher and peer feedback for trainee doctors’ medical specialty choice? An observational study in Taiwan. BMJ open. 2018;8:e020769.
Hamdy H. Blueprinting for the assessment of health care professionals. Clin Teach. 2006;3:175–9.
Issenberg SB, McGaghie WC, Petrusa ER, Gordon DL, Scalese RJ. Features and uses of high-fidelity medical simulations that lead to effective learning: a BEME systematic review. Med Educ. 2005;27:10–28.
Luctkar-Flude M, Wilson-Keates B, Larocque M. Evaluating high-fidelity human simulators and standardized patients in an undergraduate nursing health assessment course. Nurse Educ Today. 2012;32:448–52.
Yanhua C, Watson R. A review of clinical competence assessment in nursing. Nurse Educ Today. 2011;31:832–6.
Brighton R, Mackay M, Brown RA, Jans C, Antoniou C. Introduction of undergraduate nursing students to an objective structured clinical examination. J of Nur Educ. 2017;56:231–4.
Chiou-Rong H, Ue-Lin C. Objective Structured Clinical Examinations Have Become a Challenge for Nursing Education in Taiwan. Annals of Nursing Practice. 2015;2:1–2.
Osaji TA, Opiah MM, Onasoga OA. OSCE/OSPE: A Tool for Objectivity in General Nursing Examination in Nigeria. J of Research in Nur Midwifery. 2015;4:47–52.
Turner JL, Dankoski ME. Objective structured clinical exams: a critical review. Family Med. 2008;40:574–8.
Angoff WH. Scales, Norms, and Equivalent Scores. In: Thorndike RL, editor. Educational Measurement. 2nd ed. Washington, DC: American Council on Education; 1971. pp. 508–600.
Brosnan M, Evans W, Brosnan E, Brown G. Implementing objective structured clinical skills evaluation (OSCE) in nurse registration programmes in a center in Ireland: a utilization focused evaluation. Nurse Educ Today. 2006;26:115–22.
Barry M, Noonan M, Bradshaw C, Murphy-Tighe S. An exploration of student midwives’ experiences of the Objective Structured Clinical Examination assessment process. Nurse Educ Today. 2012;32:690–4.
Mitchell ML, Henderson A, Groves M, Dalton M, Nulty D. The objective structured clinical examination (OSCE): optimizing its value in the undergraduate nursing curriculum. Nurse Educ Today. 2009;29:398–404.
Lee KL, Tsai SL, Chiu YT, et al. Can student self-ratings be compared with peer ratings? A study of measurement invariance of multisource feedback. Adv Health Sci Education Theory Practice. 2016;21:401–13.
El Darir AS, Abd El Hamid NA. Objective structured clinical examination versus traditional clinical student’s achievement at maternity nursing; a comparative approach. J of Dental Med Sciences. 2013;4:63–8.
Yamada T, Sato J, Yoshimura H, Hiraoka E, Shiga T, Kubota T, Fujitani S, Machi S, Ban N. Reliability and acceptability of six station multiple mini-interviews: past-behavioural versus situational questions in postgraduate medical admission. BMC Med Educ. 2017;17(1):57.
LaRochelle J, Durning SJ, Boulet J, van der Vleuten C, van Merrienboer J, Donkers J. Beyond standard checklist assessment: Question sequence may impact student performance. Perspectives on Med Educ. 2016;52:95–102.
Takahashi SG, Rothman A, Nayer, Urowitz MB, Crescenzi AM. Validation of a large-scale clinical examination for international medical graduates. Can Fam Physician. 2012;58(7):e408-17.
Lee J, Lee Y, Lee S, et al. Effects of high-fidelity patient simulation led clinical reasoning course: Focused on nursing core competencies, problem solving, and academic self-efficacy. Jpn J Nurse Science. 2016;13:20–8.
Saitoh E, Kanada Y, Tomita M, et al. The Objective Structured Clinical Examination (OSCE) for Physical Therapist and Occupational Therapist. Tokyo: Kanahara Publication; 2011. pp. 3–5.
Carroll JG, Monroe J. Teaching medical interviewing: a critique of educational research and practice. J Med Educ. 1979;54(6):498–500.
The authors would like to thank Prof. Lynn Monrouxe for her assistance in reviewing this manuscript. The preliminary results were presented at the AMEE (An International Association for Medical Education) conference in 2019.
This research was funded by the Ministry of Science and Technology, Taiwan (MOST 105-2511-S-182 A-001 MY3). The funding body was not involved in the design of the study.
Ethics approval and consent to participate
The study protocol (institutional review board approval number: 106-2673 C) was approved by a local research ethics committee at the Chang Gung Memorial Hospital in Taiwan. Written informed consent was obtained from all individual participants included in the study.
Consent for publication
Not applicable as no individual personal data is presented.
The authors declare that they have no competing interests and no financial relationship with other organizations sponsoring this research.
Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.
Cite this article
Chen, SH., Chen, SC., Lai, YP. et al. The objective structured clinical examination as an assessment strategy for clinical competence in novice nursing practitioners in Taiwan. BMC Nurs 20, 91 (2021). https://doi.org/10.1186/s12912-021-00608-0
- clinical competence
- new nurses
- occupational stress
- objective structured clinical exam (OSCE)