This longitudinal, mixed-methods study used a phenomenographic approach, which offers a more nuanced understanding of teaching and learning because it captures the experience of both tutors and students [14,15,16]. Structured non-participant observation was used to assess the amount of student-focused teaching in each school at baseline.
Participants
All students from all 14 MCHA schools were asked to participate in the study. Given the large number of students (750), self-administered questionnaires were used to determine student satisfaction with the learning and teaching environment. Each school admits approximately 40–60 students per cohort, with a total of 1258 hours of taught theory delivered by 10 teachers per school. At the request of the MoHS, all 14 schools were included in the programme to prevent any school from being disadvantaged. In addition to the tutor training, each school received skills-room equipment and audio-visual aids to assist tutors in developing more interactive teaching methods.
Data collection
From 2013 to 2014, four education experts from the research team facilitated four district-based, 2-day workshops for 10 core teaching staff (including the school coordinator) from each of the 14 MCHA schools (140 core staff in total), with 35 tutors in each workshop. None of the MCHA tutors had a formal teaching qualification.
The workshops aimed to develop tutors' understanding of teaching and learning theories and how to apply these in practice; to introduce the concept of student-focused learning; to show how to apply different teaching methods in the classroom; and to develop skills in reflective practice. The workshops covered active teaching and learning methods, lesson planning, writing learning outcomes, reflective practice, supportive supervision, mentorship and effective learning environments. During the final session of the 2-day workshop, tutors developed an action plan for implementing their learning in their own school. Tutors also took part in formative teaching practice, both in the classroom and in the simulation/skills room.
All workshops were completed within a two-week period, and the first follow-up observations were conducted 3 months afterwards. It was anticipated that this would allow tutors time to start implementing changes in some of their lessons. Only tutors who had attended a workshop were observed.
A descriptive analysis of the age and entry qualifications of MCHA students at the time of the study was conducted by reviewing admission records. This was done to establish compliance with admission criteria, as students with lower entry qualifications may find it more difficult to cope with the curriculum.
Prior to the tutor training programme, a baseline visit was conducted at each of the 14 MCHA schools in August–September 2013. Three follow-up visits were planned to all 14 MCHA schools across Sierra Leone, timed to coincide with key dates during the MCHA training.
The first follow-up took place at the end of the students' first year (3 months from baseline), the second immediately prior to their second-year examinations (6 months from baseline) and the final follow-up at the end of the programme (12 months from baseline).
Due to the Ebola epidemic in Sierra Leone, which lasted from May 2014 to December 2015, the MCHA schools were closed in August 2014. Only 10 of the 14 schools could be visited for the second follow-up, and no schools could be visited for the third. Final results are therefore based on the 10 schools seen at the second follow-up.
At each visit (baseline, 3 months and 6 months), both qualitative and quantitative methods were used to gain as full a picture as possible of the teaching and learning within each school and to allow triangulation of data. Data collectors worked in pairs to administer the student questionnaire but conducted independent observations of the teaching using a standardised data collection tool.
The aim was to observe two one-hour teaching sessions at each visit in each MCHA school (2 sessions × 14 schools × 3 visits = 84 observations) and to ask 750 students to complete a questionnaire at each visit (3 × 750 = 2250 questionnaires). Outcome measures were student evaluation of teaching and learning, an increase in student-focused sessions, a reduction in the use of didactic teaching methods and an increase in student learning.
Both tutors and students were provided with written and verbal information about the study by the research team and asked to sign a consent form if they agreed to participate.
Observation of teaching
A structured non-participant observation method was used to observe teaching sessions in each school at each visit [17]. The research team gave the MCHA tutors a full explanation of the purpose of the planned visits to their schools and sought their consent. Event sampling was used to select the teaching sessions and to keep disruption of the normal school timetable to a minimum [18]. An underlying assumption of structured observation is that the researchers are familiar with, and understand, the activity being observed [19]; each member of the research team was involved with and experienced in teaching and learning at a pre- and/or post-registration level. A modified pre-designed structured observation form was used to assess teaching methods, student learning and student involvement.
Researchers also took notes during the observations, which were subsequently transcribed. Transcription was completed by the same members of the research team to aid familiarisation with the data, a key aspect of the framework approach [20]. One member of the research team independently coded the transcripts from four schools, producing an initial coding framework. Although a deductive approach was used to provide an initial structure for the observations, an inductive open-coding approach was taken for the final coding [20]. Informal feedback was provided to tutors by the research team after the observations.
Student questionnaires
Students were asked to evaluate the teaching and learning within their own school through an anonymised, self-administered questionnaire. Questionnaires were adapted for language and clarity, in partnership with the MoHS, from those used at the Liverpool School of Tropical Medicine to obtain student feedback on teaching. Students rated the lessons in three areas: teaching methods, student learning and student involvement, as these are thought to be the key components influencing teaching and learning [11]. The questionnaire comprised 19 questions rated on a 5-point Likert scale. Basic demographics, including the student's age and highest academic qualification on entry to the programme, were also collected. Students completed the questionnaire immediately after the observed lesson, and MCHA tutors were asked to leave the classroom while it was completed.
Data analysis
Qualitative data analysis was completed using the Framework Analysis approach, which is useful where data cover similar topics or key issues and can therefore be categorised [20]. In this study, the structured observation form used a deductive approach to pre-select key themes of teaching and learning, which were treated as the key issues [20]. It was expected that other sub-themes might also emerge at follow-up visits and would be incorporated into the Framework Analysis.
All student questionnaires were electronically scanned and processed using Formic®, and SPSS version 22 was used for analysis. New random samples were selected on each occasion, so most students completed a questionnaire at only one of the three assessment times; these samples were therefore treated as statistically independent. One-way analyses of variance (ANOVA) were used to evaluate changes in mean student satisfaction scores across the three assessment points, with statistical significance set at the conventional alpha level of 5% (p ≤ 0.05). Each test item (question) was analysed individually and then the total score across all 19 items was evaluated; Cronbach's alpha (coefficient of reliability) was calculated to confirm adequate internal reliability of this total score.
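As a rough illustration of these two analyses, the sketch below (written in Python rather than SPSS, which was the software actually used in the study) computes a one-way ANOVA of the total satisfaction score across the three assessment points and Cronbach's alpha for the 19-item scale. The file name and column names ("visit", "q1"–"q19") are hypothetical and stand in for the scanned questionnaire data.

```python
# Minimal sketch, assuming a hypothetical CSV export of the scanned responses
# with one row per completed questionnaire, a "visit" column (baseline / 3 months
# / 6 months) and 19 Likert items scored 1-5 in columns q1..q19.
import pandas as pd
from scipy import stats

df = pd.read_csv("student_questionnaires.csv")   # hypothetical file name
items = [f"q{i}" for i in range(1, 20)]          # the 19 Likert items
df["total"] = df[items].sum(axis=1)              # total satisfaction score per student

# One-way ANOVA: does the mean total score differ across the three assessment points?
groups = [g["total"].values for _, g in df.groupby("visit")]
f_stat, p_value = stats.f_oneway(*groups)
print(f"ANOVA: F = {f_stat:.2f}, p = {p_value:.3f} (significant if p <= 0.05)")

# Cronbach's alpha for the 19-item total score:
# alpha = k/(k-1) * (1 - sum of item variances / variance of the total score)
k = len(items)
item_vars = df[items].var(axis=0, ddof=1)
total_var = df["total"].var(ddof=1)
alpha = (k / (k - 1)) * (1 - item_vars.sum() / total_var)
print(f"Cronbach's alpha = {alpha:.2f}")
```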