
Where do nursing students make mistakes when calculating drug doses? A retrospective study

Abstract

Background
Research internationally shows that nursing students find dosage calculation difficult. Identifying the specific aspects of dose calculation procedures that are most commonly associated with errors would enable teaching to be targeted where it is most needed, thus improving students’ calculation skills. The aim of this study was to analyze where specifically nursing students make mistakes when calculating drug doses.

Methods
Retrospective analysis of written examination papers including dosage calculation exercises from years 1, 2, and 3 of a nursing degree program. Exercises were analyzed for errors in relation to 23 agreed categories reflecting different kinds of calculation or steps in the calculation process. We conducted a descriptive and bivariate analysis of results, examining the relationship between the presence of errors and the proportion of correct and incorrect final answers.

Results
A total of 285 exam papers including 1034 calculation exercises were reviewed. After excluding those that had been left blank, a total of 863 exercises were analyzed in detail. A correct answer was given in 455 exercises (52.7%), although this varied enormously depending on the type of exercise: 89.2% of basic dose calculations were correct, compared with just 2.9% of those involving consideration of maximum concentration. The most common errors were related to unit conversion, more complex concepts such as maximum concentration and minimum dilution, or failure to contextualize the answer to the clinical case. Other frequent errors involved not extracting the key information from the question, not including the units when giving their answer, and not understanding the question. In general, fewer errors in basic dose calculations were made by students at later stages of the degree program.

Conclusions
Students struggle with more complex dose calculations. The main errors detected were related to understanding the task and the key concepts involved, as well as not following the correct steps when solving the problem.


Background
Dosage calculation errors can have serious consequences when administering medicines, and hence numeracy skills are crucial for ensuring safe medication management [1]. A study by Ross et al. [2] in a pediatric teaching hospital found that 8% of medication incidents involved a tenfold error in the dose administered. More recently, a systematic review of intravenous admixture drug preparation errors found that the reported incidence of wrong doses across the studies reviewed ranged from 0 to 32.6%, while for wrong diluent volume the range was 0.06 to 49.0%. The authors highlighted the need in future research to develop standardized definitions for these types of errors so as to facilitate a better understanding of where they happen within the drug preparation process and to devise ways of avoiding them [3]. This underlines an issue raised by other authors, namely that it is difficult to know whether an error in the final dose administered is due solely or primarily to miscalculation, insofar as it may be the result of an error further back in the process, such as pharmacy mislabeling or incorrect or unclear prescribing [2].

Research focusing on nursing students suggests they often have poor drug calculation skills, due especially to difficulties with understanding mathematical principles [4]. Accordingly, although they are usually able to perform simple calculations, they struggle with tasks involving multiple steps and which require a higher level of conceptual knowledge [5]. A further issue concerns the extent to which nursing students are aware of their errors. In this regard, a recent study in which students were asked to indicate their level of certainty about answers given to a pharmacology knowledge questionnaire concluded that there was a high risk of medication administration error in 14% of the students who rated incorrect answers with high certainty [6].

Aside from their arithmetic skills, there are a number of other factors that may contribute to nursing students’ difficulties with dose calculation. One of these is math anxiety, which can undermine their ability to understand and complete tasks involving mathematics [7]. A fear of math, resulting in resistance to learning math for medication administration, has also been noted in research using focus groups to explore students’ own perspectives on learning math for medication calculation [8]. Other themes that emerged in the same study were: resentment among students towards what they perceive as ‘complicated’ math; lack of confidence among students leading to a fear of error in clinical practice; a recognition among students that they need to be more self-directed in developing their math skills; and the need for clinical instructors to be consistent in giving students the opportunity to practice calculations in the clinical setting [8].

Notwithstanding the difficulties that students experience with dose calculation, the literature suggests that the problem is far from unsurmountable. Indeed, various studies have reported the effectiveness of workshops or web-based platforms designed to support students’ learning and improve their medication calculation skills [9,10,11,12,13].

Another issue to consider, and one highlighted in a recent comparative study of six European countries [5], concerns cross-national differences in medication education regulations and practices and the competences that graduating nurses are expected to have acquired. For instance, some professional bodies such as the UK Nursing & Midwifery Council recommend that nurse education institutions should require students to achieve a 100% pass on a health numeracy assessment including calculation of medicines [14]. In our country, Spain, a strict criterion such as this is not applied within nurse education programs, although safe medication management is considered a key competence for students to acquire during their pre-registration university training [15].

Whatever the requirements and approach to training, it is vital that nursing students acquire adequate numeracy skills, as once they enter professional practice they will be responsible for administering medication. The importance of their becoming proficient in this respect by the time they graduate is underlined by research suggesting that difficulties with drug calculation often persist among registered nurses [16,17,18]. From the perspective of nurse education, therefore, it is crucial to identify where students struggle the most when it comes to dosage calculation. Despite this, few studies have examined in detail the specific aspects of the drug calculation process where students make mistakes [7, 19,20,21].

Given that the dose calculation skills of both student and registered nurses are an issue of international concern, and one that has implications for patient safety, our aim in the present study was to conduct an in-depth analysis of the kinds of errors that undergraduate nursing students make when performing dose calculation exercises. Identifying the specific aspects or steps in the process they find most difficult would enable nurse educators to target teaching where it is most needed.


Methods
This was a descriptive retrospective study in which we reviewed all the examination papers that included at least one dose calculation exercise and which had been submitted over two academic years (2017-18 and 2018-19) by students from years 1, 2, and 3 of a nursing degree program at our university.

Setting
In Spain, nursing degrees last 4 years, and successful graduates are eligible to independently perform oral and higher-risk medication management, including intravenous injections and infusions [15]. Nursing students at our university begin to be taught dose calculation and medication administration in year 1 of the degree program. This is done through both theory classes and low-fidelity simulation, totaling 6 hours. At this stage of their training, students learn how to administer medication through different routes (e.g., oral, IV) and to make simple dose calculations on paper (exercises such as “Basic dose calculation” or “Infusion rate”; see Supplementary material 1 for definitions). In year 2, students attend a 2-hour theory class in which they are required to perform the same exercises as in year 1, as well as new types of calculation such as “Unit conversion”, “Dose according to patient’s weight”, and “Dose calculation involving a percentage”. In year 3, theory classes and low-fidelity simulation (totaling 16 hours) provide students with further practice in the aforementioned types of calculation exercise and different routes of drug administration, this time incorporating more advanced aspects such as the use of infusion pumps. In year 4, students do not receive classroom instruction in dose calculation or medication administration, as the year is spent almost entirely on clinical placement.

Students begin clinical placements in year 1 of the degree program, following completion of all the theory classes. Over the 4 years of their studies, they complete a total of 2300 hours on placement. Because students are assigned to different clinical settings, the experience they gain in relation to dose calculation and medication administration may vary.

Examination papers reviewed

All the examinations reviewed were time limited and had been sat in the presence of an invigilator, subsequent to having received the instruction corresponding to each course year (see Setting). For year 1 students, the dose calculation exercise formed part of a written station of an objective structured clinical examination (OSCE). The calculation exercises in year 2 were part of a written exam and consisted of 10 questions of varying difficulty and complexity. This part of the exam had been purposely designed to include some calculation exercises that went beyond the level required of year 2 students, and thus they would not necessarily be expected to answer them all correctly (exercises 7, 8, and 9 in the Year 2 block of Table 1). The rationale for including more difficult problems was that we would then be able to track the progress of individual students by setting them the same questions (changing only the numerical values involved) in years 3 and 4 of their studies (the questionnaire used is shown in Supplementary material 2). This progress monitoring is not part of the present analysis. Finally, the exercises corresponding to year 3 formed part of an ordinary written exam and of an OSCE. Table 1 summarizes the exercises used in each of the 3 years; further details, with examples, are given in Supplementary material 3.

Table 1 Types of exercises included in the analysis

Procedure
In a first step, we selected a random sample of papers covering all the different exams. In the absence of an existing rubric for classifying errors, two members of the research team, working independently, then carried out an initial content analysis in order to categorize the different types of error made by students in each exercise. This provisional set of categories was then agreed with the rest of the team and checked for clarity and relevance by asking each team member to apply it to a small sample of exam papers. This process yielded a consensus list of 23 categories that were used to analyze the total sample of papers (see Supplementary material 3). For this final analysis, and given the large number of records, the exam papers were distributed among pairs of researchers, who first reviewed them individually before comparing and discussing their evaluation with that of the co-evaluator so as to reach a consensus decision. The task in each case was to record 1) whether the category was applicable or not to a particular exercise, and 2) whether the student’s answer took into account the aspect referred to by the category, and if so whether they did so correctly, partially or incorrectly. At this stage in the process, papers were coded so that only the evaluator knew the identity of the student. The results of this analysis were recorded using a spreadsheet, which was then reviewed by the principal investigator (PI) to check for any inconsistencies or errors (e.g., the category regarding unit conversion was wrongly recorded as being applicable to an exercise that did not require this calculation); in the event that a problem was identified, the PI asked the evaluator who had reviewed the corresponding exam paper to make the necessary correction(s) to the database. Once the accuracy of the database had been checked, its contents were fully anonymized by deleting the aforementioned codes.

Data analysis

Any exercises that were left blank and had not been attempted by the student were excluded. For the descriptive analysis we calculated absolute and relative frequencies, and where appropriate the mean and standard deviation. Bivariate analysis using either the chi-square or Fisher’s exact test, as appropriate, was then conducted to examine whether there were significant differences between the proportion of correct and incorrect answers depending on the presence of errors in each of the aspects (categories) analyzed. McNemar’s test was used for the analysis of paired data. All data analyses were performed using SPSS for Windows 21. When categorizing errors, the researchers sometimes added qualitative comments either to clarify the nature of the error or to justify their choice of category. Those comments that clarified the nature of errors or which provided extra information about them were logged and are considered in the presentation of results.

For years 2 and 3, where more than one exercise was analyzed for each student, we calculated the overall mark out of 10 so as to have an overview of the student’s level of knowledge and to facilitate discussion of results. However, we also analyzed the individual results for each exercise.

Ethical considerations

Approval for the study was granted by both the Department of Nursing and the Research Ethics Committee of the Universitat Internacional de Catalunya (ref. INF-2018-05). The need for informed consent was waived by the ethics committee due to the retrospective nature of the study (see Procedure section). All methods were carried out in accordance with relevant guidelines and regulations.

In preparing the present article, we referred to the STROBE checklist of items that should be included in reports of descriptive retrospective studies [22].

Results
We reviewed 285 examination papers that included 1034 calculation exercises. After excluding those exercises that had been left blank (n = 171), a total of 863 exercises were analyzed. Table 2 shows the number of exercises reviewed and analyzed for each of the three course years.

Table 2 Distribution of calculation exercises reviewed and analyzed by course year

Of the exercises analyzed, 455 (52.7%) were answered correctly. Table 3 shows, for each type of exercise, the number attempted and the number answered correctly. Overall, a correct answer was given in 28.4% of the clinical case exercises in year 1, in 50.9% of the exercises in year 2 (mean score of 5.2 (SD 2.2) out of 10), and in 41.8% of those in year 3. The mean score across the three pediatric exercises in year 3 (final three rows in Table 3) was 4.6 (SD 3.2) out of 10.

Table 3 Calculation exercises attempted and those answered correctly in the sample analyzed

Regarding the method used to solve the calculation problems (not including the 69 exercises corresponding to theoretical unit equivalences), in 67.1% (n = 533) of cases the student used the rule-of-three method, in 18.0% (n = 143) a conversion factor (i.e. a number for changing given units to desired units), and in 3.1% (n = 25) both these methods. In 93 exercises (11.7%), the student gave an answer directly without using either of these methods. Overall, 90% of the exercises analyzed were considered complete as the student indicated a final answer to the problem, although the result was only correct in 52.7% of cases.
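The two solution methods recorded above can be illustrated on the same hypothetical prescription. The sketch below applies both the rule of three and a conversion factor; all numbers are invented for illustration and are not taken from the exam papers.

```python
# Hypothetical prescription: 250 mg prescribed, stock of 500 mg in 10 ml.
prescribed_mg = 250.0
stock_mg = 500.0
stock_ml = 10.0

# Rule of three: if 500 mg corresponds to 10 ml,
# then 250 mg corresponds to x ml.
volume_rule_of_three = prescribed_mg * stock_ml / stock_mg

# Conversion factor: multiply the prescribed dose by the number
# that changes the given units (mg) into the desired units (ml).
ml_per_mg = stock_ml / stock_mg
volume_conversion_factor = prescribed_mg * ml_per_mg

assert volume_rule_of_three == 5.0                      # ml to administer
assert abs(volume_conversion_factor - 5.0) < 1e-9       # same result
```

Both methods are arithmetically equivalent; the difference observed in the exam papers lies in how students set the problem up, not in the underlying operation.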

Most common errors

Table 4 shows the different aspects of the calculation exercises where errors were observed, distinguishing between answers that were ultimately correct or incorrect. For each of these aspects or categories, we examined statistically the relationship between the presence of errors and the proportions of correct and incorrect final answers. It can be seen that with the exception of two categories (i.e., consideration of the diluent solution, and error carried forward), the presence of errors was associated with a significantly higher proportion of incorrect final answers. The areas where errors were most frequently observed concerned an understanding of percentages and unit equivalences (51.6 and 37.7%, respectively), as well as calculations involving more advanced concepts such as maximum concentration and minimum dilution (48.7%). We also found, based on those exercises where it could be analyzed, that in around a third of cases (32.9%) students did not check whether their answer made sense and was realistic, with a related problem being failure to contextualize their result to the patient in question (25.3%).

Table 4 Aspects of the exercises analyzed where errors were observed

Importantly, students also had difficulties with more basic aspects such as calculating the IV infusion rate (37.7%), the infusion time (29.6%), and the volume of solution in which the drug should be dissolved (23.9%). Furthermore, in 51.3% of cases, students did not correctly extract the key information from the question when trying to solve the problem, and this was associated with significantly more incorrect answers (p < .001). Finally, we found that in 16.9% of the exercises analyzed, the student had clearly not understood the task; this was more commonly the case with more difficult exercises or calculations involving more than one step.

Comparison of exercises across the three course years showed that the overall number of errors tended to decrease as students progressed through the degree program, except in relation to performing a mathematical calculation (p = .997), for which the proportion of exercises containing an error remained fairly stable (between 5 and 9%). However, the proportion of students who failed to understand the question increased across course years as the exercises set became more complex (year 1: 5.7%; year 2: 24%; year 3: 37.7%; p < .001), with a similar trend being observed for the percentage of students who did not follow the correct steps in solving the problem (year 1: 35.2%; year 2: 29.3%; year 3: 52.6%; p = .043).

Answers according to type of exercise

Unit equivalences

The unit equivalences (theory) exercise in year 2 comprises five questions, and only 8 (11.6%) students answered them all correctly. The unit equivalence they were most familiar with was g – mg (84%), followed by ml – cc, ml – microdrops (both 65.2%), mcg (μg) – mg (52.2%), and, finally, the percentage equivalence, % = mg/ml (18%).
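For reference, the five equivalences tested can be written as simple conversion rules. The sketch below encodes them with illustrative values; the 60 microdrops/ml figure is the usual convention for microdrip sets and is an assumption here, not a value stated in the exam.

```python
def grams_to_mg(g):
    return g * 1000.0       # 1 g = 1000 mg

def mg_to_mcg(mg):
    return mg * 1000.0      # 1 mg = 1000 mcg (μg)

def ml_to_cc(ml):
    return ml * 1.0         # 1 ml = 1 cc

def ml_to_microdrops(ml):
    return ml * 60.0        # assuming the usual 60 microdrops per ml

def percent_to_mg_per_ml(pct):
    return pct * 10.0       # x% = x g per 100 ml = 10x mg/ml

assert grams_to_mg(0.5) == 500.0
assert mg_to_mcg(0.25) == 250.0
assert ml_to_microdrops(1.0) == 60.0
assert percent_to_mg_per_ml(0.9) == 9.0   # e.g. 0.9% saline = 9 mg/ml
```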

Exercises referring to an adult patient

Unit conversion

Consistent with the overall results for the unit equivalences (theory) test, we found that 40.6% of students in year 2 made a mistake when converting units in the clinical exercises. The evaluators commented that two students obtained an incorrect answer because they took the abbreviation mcg to mean microdrops (it should be noted here that the Spanish word for drop is gota, which led these students to misinterpret the letter g in mcg). Another comment made was: “they get a wrong answer because they don’t know how to convert mg to mcg”. Notably, one student failed to double-check an answer that, in practical terms, was completely unrealistic (dose to administer of 10⁻⁶ mg).

Drug concentration calculations

In the dose calculation exercise involving a percentage (year 2), 90.6% of students obtained a correct answer, much higher than the proportion who, in the theory test, knew that % = mg/ml. It should be noted that % was defined in this practical exercise.

When asked to calculate the drug concentration over time (exercise above the knowledge level expected of year 2 students), 25.6% of students did not calculate the infusion time, 20.9% did not understand the question, and 27.9% made an error that they then carried forward in their calculation. In their comments, the evaluators noted that on six occasions the final answer was incorrect due to rounding or use of a recurring decimal. Other comments of note included: “treats the diluent as part of the drug dose”, “doesn’t apply the rule-of-three method correctly”, and “calculation is correct, but understands minutes instead of hours”.

One of the exercises in year 3 asked students to maintain a prescribed dose for a different drug concentration and infusion rate. The most common errors here were a failure to understand the question (48%) and not knowing how to calculate the infusion rate (50%). In their comments, evaluators noted that 13 students did not know how to apply the rule-of-three method, and four did not correctly extract the information from the question, leading to wrong answers.

Basic dose calculation and administration rate

In the clinical case exercise in year 1, students had to calculate both the dose of a prescribed IV drug from the stock available and also the corresponding infusion rate. It can be seen in Table 5 that only 36% of students answered both parts of this exercise correctly, and in most cases this was due to difficulty calculating the infusion rate (p < .001).

Table 5 Results for the clinical case calculation in year 1

The proportion of year 2 students who correctly calculated the dose of an IV drug was higher than in year 1 (correct: 69.3% in year 1 vs. 84.2% in year 2), although they continued to struggle with the calculation of infusion rate (correct: 41.3% in year 1 vs. 14.1% in year 2).

Regarding the comments made by evaluators about students’ dose calculations, it was noted that six students in year 1 “were confused about the drug stock and thought it was a powder vial rather than a liquid ampoule”. Examples of comments made about the year 2 exercises included: “gets a wrong answer due to rounding in a previous calculation”, “gives the wrong units”, and “makes a mistake when multiplying”. With respect to the infusion rate calculations, comments related to the year 1 exercise included “mistakes intermittent infusion for continuous infusion”, “forgets to add the drug volume to the total amount of IV solution”, and “chooses the wrong kind of IV solution”, while those for year 2 exercises included “performs the calculation in ml/h instead of drops/minute”, “uses the wrong IV infusion set”, and “doesn’t know how to convert units”.
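The two rates that students confused are related by the drop factor of the IV infusion set. A minimal sketch, assuming a hypothetical macrodrip set delivering 20 drops/ml (set characteristics vary and are not specified in the exam papers):

```python
def rate_ml_per_h(volume_ml, time_h):
    # Units used when programming an infusion pump.
    return volume_ml / time_h

def rate_drops_per_min(volume_ml, time_h, drop_factor=20):
    # drop_factor: drops per ml delivered by the IV set
    # (e.g. 20 for an assumed macrodrip set, 60 for a microdrip set).
    return volume_ml * drop_factor / (time_h * 60)

# Illustrative case: 480 ml infused over 4 hours.
assert rate_ml_per_h(480, 4) == 120.0        # ml/h (pump)
assert rate_drops_per_min(480, 4) == 40.0    # drops/min (manual set)
```

Errors such as “performs the calculation in ml/h instead of drops/minute” amount to omitting the drop-factor step shown above.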

Exercises referring to a pediatric patient

The first three exercises analyzed here come from year 2 of the degree program, while the clinical case exercises correspond to year 3 students. Only the first of these three year 2 exercises (Dose calculation by patient’s weight) corresponded to the knowledge level expected of year 2 students.

Dose calculation by patient’s weight

The majority of students were able to perform this calculation correctly. Comments made by evaluators regarding incorrect answers included: “divides instead of multiplies so gets it wrong”, “doesn’t know how to do the calculation”, and “wrong units”.
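This type of calculation simply scales the prescribed dose per kilogram by the patient’s weight; dividing instead of multiplying, as noted by the evaluators, inverts that step. A minimal sketch with invented values:

```python
def dose_by_weight(mg_per_kg, weight_kg):
    # Weight-based dose: prescribed mg/kg multiplied by the patient's weight.
    return mg_per_kg * weight_kg

# Illustrative case: 15 mg/kg prescribed for a 12 kg child.
assert dose_by_weight(15, 12) == 180   # mg

# The error "divides instead of multiplies" would give 15 / 12 = 1.25 mg,
# an implausibly small dose that a sense-check should catch.
```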

Total infusion time, taking into account the maximum flow rate

The large majority of errors here were due to a lack of understanding of the concepts maximum concentration and minimum dilution (52.8%), and to confusion between intermittent and continuous infusion (54.7%). Overall, 56.6% of students were unable to calculate the total infusion time.

Volume of diluent, taking into account the maximum concentration

Only two students (5.3%) correctly answered this exercise. Most of the errors were related to not understanding the concepts maximum concentration or minimum dilution (94.7%) and to not knowing the appropriate volume of diluent that should be used (73.7%). In addition to noting that students did not know how to perform the calculation, evaluators also commented that many of them only calculated the drug dose (“only calculates the ml of aciclovir”) and failed to calculate the corresponding volume of diluent.
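The concept this exercise tests can be sketched as follows: the dose must be dissolved in enough diluent that the final concentration stays at or below the maximum allowed. The dose and concentration ceiling below are invented for illustration, not the exam values, and the sketch ignores the small volume contributed by the drug itself.

```python
def minimum_diluent_volume(dose_mg, max_concentration_mg_per_ml):
    # Minimum dilution: the smallest volume in which dose_mg can be
    # dissolved without exceeding the maximum allowed concentration.
    return dose_mg / max_concentration_mg_per_ml

# Illustrative case: 350 mg dose, assumed ceiling of 7 mg/ml.
assert minimum_diluent_volume(350, 7) == 50.0   # ml of diluent, at minimum
```

Stopping after calculating the ml of drug, as many students did, leaves this dilution step (the actual point of the exercise) unanswered.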

Clinical case exercises

We analyzed two clinical pediatric exercises from year 3, both of which required students to make calculations involving more than one step. One of these cases comprises two parts, which were analyzed separately.

The most frequent errors were not contextualizing the answer to the case (between 24.6 and 73.3% of answers, depending on the exercise), not checking that the result was realistic and made sense (between 31.2 and 65.6%), and not fully understanding the question (between 7.8 and 26.2%). It should be noted that in one of the exercises, 14.1% of students only partially completed the mathematical calculation and thus could not obtain a final result. In the other two exercises, some students (6.3 and 14.8%, respectively) did not know how to calculate the appropriate amount of diluent. Finally, and related to the fact that these exercises involved multi-step calculations, we found that although the correct steps were followed by between 52.5 and 100% of students (depending on the exercise), the final result was incorrect in between 42.1 and 63.9% of cases due to an error being carried forward.

Discussion
This article presents a detailed analysis of the errors made by nursing undergraduates when performing written dose calculation exercises. The results add to existing evidence regarding the kinds of problems that nursing students have with calculation exercises [21]. Our analysis suggests that students’ overall level of calculation skills is limited, although they are generally able to perform basic dose calculations. The proportion of students who correctly answered the exercises set was slightly below that reported in some studies [19, 23, 24], but similar to that observed by Bagnasco et al. [4]. However, our results should be interpreted with caution as a small number of the year 2 exercises we analyzed included questions that went beyond the level of knowledge that students were expected to have at this point in their studies; as noted above in Method (sub-section Examination Papers Reviewed), these more difficult problems were deliberately included as a way of enabling us to track the progress of individual students in subsequent years of the degree program. If we omit these three questions from the analysis, the mean grade obtained by students on this exam increases from 5.2 (SD 2.2) to 6.3 (SD 2.5) out of 10, a figure consistent with pass rates reported in the aforementioned literature. Given that these year 2 students also have two more years of their degree ahead of them, it would obviously be interesting to follow them up and compare their level of achievement on the same kinds of dose calculation problems in years 3 and 4, thus providing an indication of the level they have reached by the time they transition to professional practice.

One of the errors we observed, which was also discussed by Bagnasco et al. [4], reflected students’ difficulty with converting units. In Spain, this concept is taught during secondary education (with the exception of % = mg/ml and drops – ml), and hence students should be able to perform this operation by the time they enter university. More specifically, we found that students had greater difficulty moving up the scale of units (e.g., from mcg to mg) rather than down, and also that some students correctly converted units in the clinical case exercise but not in the unit equivalences theory test, and vice-versa. Given their importance in clinical practice, greater attention needs to be paid to these concepts during students’ training.

Our analysis also showed that students were more likely to produce a correct answer when they applied a structured approach to the problem (e.g., extract all the key information from the question, use the correct units, and follow the correct calculation steps). This structured approach is taught at earlier stages of education in our country, although it may not be adequately assimilated by all students. If this is the case, then it could have a negative impact on their ability to perform medication calculations, which tend to be more complex in content than the purely arithmetic problems they will have been set during secondary education [21, 25].

Obviously, students would most likely find calculation exercises easier if they involved a single task expressed in clear and concise terms. However, this would not reflect clinical reality, because in practice a dose calculation is made in the context of a specific patient and other variables that may affect the final result must also be taken into account. Setting students contextualized case exercises therefore helps to reduce the theory-practice gap [26]. Accordingly, the exercises analyzed in this study involved clinical scenarios of varying complexity, and it was noticeable that the easier questions (those with just one or two calculation steps) were more likely to be answered correctly than were the more complex multi-step problems. This is reflected in the proportion of exercises where the student had clearly not understood the question (16.9%), which tended to be the more difficult problems involving more than one calculation [25]. In those exercises (n = 328) where it was possible to analyze whether the student had contextualized their result and checked whether it was realistic and made sense, we found that 45.5% failed to do so. In order to solve problems of this kind, students must understand precisely what they are being asked to do and extract from the question the key information they need to perform the calculation correctly [21]. In this context, Grunetti et al. [19] found that while students may find it helpful to use a calculator, this can also produce a false sense of security, such that they do not then consider whether the result makes sense or not. This kind of error is much more common among students who do not have a good grasp of the principles of mathematical calculation or who struggle with logical reasoning [27]. An example from our analysis was a student who did not question a final result giving a drug dose of 10⁻⁶ mg, even though it should be obvious that it is impossible to administer such a small amount to a patient.
In our view, clinical simulation with manikins is a highly useful tool for helping students develop their skills in this respect [27, 28], insofar as it allows them to see the practical result of their calculations (i.e., the drug volume to administer), in addition to providing them with an opportunity to improve their critical thinking [29] and to learn from mistakes and their peers [30]. It should also be noted that simulation is not a stress-free experience for students, as they will be observed and be set a time limit for performing the task, and in this respect it more closely resembles the realities of clinical practice [31]. As an alternative or complement to simulation, one might also use more active learning strategies or those in which students can see the material they need or visualize what is being explained to them, rather than it being presented in abstract or purely theoretical terms [1, 26].

Another important difficulty that students had, regardless of the course year they were in, concerned calculation of the IV infusion rate. In our view, this suggests a gap between theory and practice in this respect, because although students are taught in class how to calculate the infusion rate, they are unlikely to see clinical nurses calculate the drip rate in drops/minute as this is usually estimated by the nurse when setting up a manual IV set [32]. By contrast, students do gain practical experience of calculating the rate in ml/h, because these are the units used with infusion pumps. In accordance with Hedlund et al. [3], we also found that students had difficulties in calculations involving the total administration time or the volume of diluent, both of which are more theoretical pharmacological concepts. A task for future research would be to examine in more detail the possible relationship between students’ exposure to these kinds of calculations while on clinical placement and their performance in written examinations.

The present study has several limitations that derive from its retrospective design and the fact that the analysis is based solely on written dose calculation exercises. One is that we do not know why some exercises were left blank, that is to say, whether the student did not know how to solve the problem or simply ran out of time during the exam. Neither is it clear whether the number of correct answers would have been greater had students been allowed more time or not felt the pressure of an exam situation. On a related point, we have no way of knowing whether students who gave a correct answer were confident about their calculations or got there more by luck than judgment. It would therefore be useful in future studies to complement an analysis of this kind with qualitative feedback from students themselves regarding the difficulties they experienced and their level of confidence in their answers. Our approach also provides no insight into students’ thought processes or reasoning, which would be necessary in order to understand more about precisely why they made the errors they did; this too would be an interesting topic for future research. As noted earlier, three of the exercises set during year 2 imply a knowledge level above that expected of students at this stage of their training, the rationale being that this allows us to track the progress of individual students over subsequent years of their studies. We acknowledge, however, that in the context of the present analysis the inclusion of these exercises may bias the results obtained for year 2 students, which must therefore be interpreted with caution. A final limitation is that we have no comparative data across all years of the nursing degree regarding dose calculation skills; moreover, students spend year 4 almost entirely on placement, making it difficult to schedule a classroom-based test of their ability in this respect.
In order to build on the present results, it would be useful to introduce a final written examination (such as that used with year 2 students) at the end of every year, so as to track the evolution of students’ knowledge and the level they reach by the time they graduate.

Notwithstanding these limitations, our study also has two important strengths: one is the large number and variety of dose calculation exercises analyzed; the other is the detailed analysis of the aspects that students find most difficult.


Nursing students have adequate skills when it comes to basic dose calculations, but struggle with more complex problems, although they tend to improve as they progress through their studies. The most common errors we observed were related to not understanding or not extracting the key information from the question, not following the correct steps in their calculations, and a lack of basic knowledge such as how to convert units. The fact that some students did not consider whether their answer was realistic and made sense in a clinical context is problematic from the point of view of safe medication management. The use of high-fidelity simulation scenarios during their training could play an important role in helping them improve their skills in this respect.

Relevance to clinical practice

In order to ensure that nursing students are proficient in medication calculation by the time they graduate, nurse educators need to identify the specific aspects and steps in the process that students find most difficult, thus enabling instruction to be targeted where it is most needed. The present analysis of students’ answers to a series of dose calculation exercises of varying levels of complexity shows that their errors cannot be attributed solely to poor calculation skills, insofar as they involved different aspects and stages of the problem-solving process. Notably, students often struggled to understand key concepts associated with dose calculation and failed to follow the correct steps when performing the exercises set. Nurse education programs must, in addition to developing students’ mathematical competence, ensure they acquire an adequate understanding of the key concepts (both numerical and clinical) underpinning medication calculation, as well as an appreciation of the importance of checking that their result is realistic and makes sense in relation to each individual patient.

Availability of data and materials

The regulations of our university that cover the use of data from students do not allow us to share these datasets publicly. However, data in the form of aggregated results are available from the corresponding author upon reasonable request.


  1. Wright K. An investigation to find strategies to improve student nurses’ maths skills. Br J Nurs. 2004;13(21):1280–7.

  2. Ross LM. Medication errors in a paediatric teaching hospital in the UK: five years operational experience. Arch Dis Child. 2000;83(6):492–7.

  3. Hedlund N, Beer I, Hoppe-Tichy T, Trbovich P. Systematic evidence review of rates and burden of harm of intravenous admixture drug preparation errors in healthcare settings. BMJ Open. 2017;7(12):e015912.

  4. Bagnasco A, Galaverna L, Aleo G, Grugnetti AM, Rosa F, Sasso L. Mathematical calculation skills required for drug administration in undergraduate nursing students to ensure patient safety: a descriptive study: drug calculation skills in nursing students. Nurse Educ Pract. 2016;16(1):33–9.

  5. Elonen I, Salminen L, Brasaitė-Abromė I, Fuster P, Kukkonen P, Leino-Kilpi H, et al. Medication calculation skills of graduating nursing students within European context. J Clin Nurs. 2022;31(5-6):548–58.

  6. Caboral-Stevens M, Ignacio RV, Newberry G. Undergraduate nursing students’ pharmacology knowledge and risk of error estimate. Nurse Educ Today. 2020;93:104540.

  7. Williams B, Davis S. Maths anxiety and medication dosage calculation errors: a scoping review. Nurse Educ Pract. 2016;20:139–46.

  8. Johnson J, Kareem A, White D, Ngwakongnwi EM, Mohammadpour M, Rizkika N, et al. Nursing students’ perspectives on learning math for medication calculations in a Canadian nursing program in Qatar. Nurse Educ Pract. 2020;49:102885.

  9. Gill M, Andersen E, Hilsmann N. Best practices for teaching pharmacology to undergraduate nursing students: a systematic review of the literature. Nurse Educ Today. 2019;74:15–24.

  10. Grugnetti AM, Bagnasco A, Rosa F, Sasso L. Effectiveness of a clinical skills workshop for drug-dosage calculation in a nursing program. Nurse Educ Today. 2014;34(4):619–24.

  11. Karabağ Aydin A, Dinç L. Effects of web-based instruction on nursing studentsʼ arithmetical and drug dosage calculation skills. Comput Inform Nurs. 2017;35(5):262–9.

  12. Renmarker E, Carlson E. Evaluation of Swedish nursing students’ experience of a web-based platform for drug calculation. Nurse Educ Pract. 2019;38:89–95.

  13. Stolic S. Educational strategies aimed at improving student nurse’s medication calculation skills: a review of the research literature. Nurse Educ Pract. 2014;14(5):491–503.

  14. Nursing and Midwifery Council. Realising professionalism: Standards for education and training Part 3: Standards for pre-registration nursing programmes. 2018. Available from:

  15. National Agency for quality assessment and accreditation (ANECA). White paper. Bachelor's degree in nursing. National Agency for Quality Assessment and Accreditation 2004:1–336. Available from:

  16. Fleming S, Brady AM, Malone AM. An evaluation of the drug calculation skills of registered nurses. Nurse Educ Pract. 2014;14(1):55–61.

  17. Luokkamäki S, Härkänen M, Saano S, Vehviläinen-Julkunen K. Registered nurses’ medication administration skills: a systematic review. Scand J Caring Sci. 2021;35(1):37–54.

  18. McMullan M, Jones R, Lea S. Patient safety: numerical skills and drug calculation abilities of nursing students and registered nurses. J Adv Nurs. 2010;66(4):891–9.

  19. Grugnetti AM, Arrigoni C, Bagnasco A, Grugnetti G, Menoni S, Casey M, et al. Evaluating the effectiveness of calculator use in drug dosage calculation among Italian nursing students: a comparative study. Int J Clin Skills. 2017;11(2):57–64.

  20. Khasawneh E, Gosling C, Williams B. What impact does maths anxiety have on university students? BMC Psychol. 2021;9(1):37.

  21. Weeks KW, Lyne P, Torrance C. Written drug dosage errors made by students: the threat to clinical effectiveness and the need for a new approach. Clin Eff Nurs. 2000;4(1):20–9.

  22. STROBE strengthening the reporting of observational studies in epidemiology [internet]. STROBE Checklist 2022 [Cited August 16th 2022]. Available from:

  23. Dilles T, Vander Stichele RR, Van Bortel L, Elseviers MM. Nursing students’ pharmacological knowledge and calculation skills. Nurse Educ Today. 2011;31(5):499–505.

  24. Özyazıcıoğlu N, Aydın Aİ, Sürenler S, Çinar HG, Yılmaz D, Arkan B, et al. Evaluation of students’ knowledge about paediatric dosage calculations. Nurse Educ Pract. 2018;28:34–9.

  25. McMullan M, Jones R, Lea S. Math anxiety, self-efficacy, and ability in British undergraduate nursing students. Res Nurs Health. 2012;35(2):178–86.

  26. Weeks KW, Lyne P, Mosely L, Torrance C. The strive for clinical effectiveness in medication dosage calculation problem-solving skills: the role of constructivist learning theory in the design of a computer-based “authentic world” learning environment. Clin Eff Nurs. 2001;5(1):18–25.

  27. Jarvill M, Jenkins S, Akman O, Astroth KS, Pohl C, Jacobs PJ. Effect of simulation on nursing students’ medication administration competence. Clin Simul Nurs. 2018;14:3–7.

  28. Pettigrew J, Stunden A, McGlynn S. Contextualising numeracy skill development and assessment in a first year undergraduate nursing subject: a mixed methods research study. Nurse Educ Today. 2020;92:104426.

  29. Shin H, Ma H, Park J, Ji ES, Kim DH. The effect of simulation courseware on critical thinking in undergraduate nursing students: multi-site pre-post study. Nurse Educ Today. 2015;35(4):537–42.

  30. Palominos E, Levett-Jones T, Power T, Martinez-Maldonado R. Healthcare students’ perceptions and experiences of making errors in simulation: an integrative review. Nurse Educ Today. 2019;77:32–9.

  31. Shearer JN. Anxiety, nursing students, and simulation: state of the science. J Nurs Educ. 2016;55(10):551–4.

  32. Rooker JC, Gorard DA. Errors of intravenous fluid infusion rates in medical inpatients. Clin Med. 2007;7(5):482–5.


We would like to thank Karen Liseth Rojas Manzano for her help in the final editing of the manuscript.


This study was supported through funding from AGAUR (the Catalan Agency for the Management of University and Research Grants; grant 2017SGR141) and the Business Chair DECIDE (UIC-Boehringer Ingelheim). Neither of these funding sources was involved in the development of the study.

Author information

Authors and Affiliations



Wennberg-Capellades, L: Methodology, investigation, formal analysis, data curation, writing original draft, writing review and editing. Fuster-Linares, P: Conceptualization, methodology, investigation, writing review and editing. Rodriguez-Higueras, E: Methodology, investigation, writing review and editing. Gallart Fernández-Puebla, A: Methodology, investigation, writing review and editing. Llaurado-Serra, M: Methodology, investigation, formal analysis, data curation, writing original draft, writing review and editing, supervision. The authors read and approved the final manuscript.

Corresponding author

Correspondence to Laia Wennberg-Capellades.

Ethics declarations

Ethical approval and consent to participate

This study was approved by the research ethics committee of the Universitat Internacional de Catalunya (UIC Barcelona) (ref. INF-2018-05). The need for informed consent was waived by the ethics committee due to the retrospective nature of the study.

Consent for publication

Not applicable.

Competing interests

The authors declare no conflict of interest.

Additional information

Publisher’s Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Supplementary Information

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. The Creative Commons Public Domain Dedication waiver applies to the data made available in this article, unless otherwise stated in a credit line to the data.

About this article

Cite this article

Wennberg-Capellades, L., Fuster-Linares, P., Rodríguez-Higueras, E. et al. Where do nursing students make mistakes when calculating drug doses? A retrospective study. BMC Nurs 21, 309 (2022).
