
Open access
Research Publication

How Instructors Think About Technology in Simulation-Based Training: A Survey of Three Programs for Professional Education

Abstract

Simulation-based training has emerged as a key pedagogical method for shaping the integration of theory and skills in many professions. However, professional disciplines use simulations in various ways. This article presents data from a survey of Scandinavian simulation instructors from three different professional educational programs. The aim of this comparative investigation is to enhance our empirical knowledge about how instructors use and evaluate technology in simulation-based training. Our results show that while the instructors share an overall positive opinion about simulation, and are familiar with many technologies, the three fields make distinct choices about frequency and preferred modality. The differences and similarities highlighted in the findings provide a stepping stone for cross-disciplinary learning and in-depth studies on underlying pedagogical assumptions.

Keywords

  1. Simulation-based teaching methods
  2. technology
  3. professional bachelor education

Introduction

Simulation-based teaching methods have become ubiquitous across educational programs in many professions. Cannon-Bowers and Bowers (2009) refer to these methods as “synthetic learning environments”, as they are characterized by technology-enabled instructional environments that are actively based on representations of some relevant real-life processes or activities through simulation, games, or virtual worlds.
Simulations provide a safe environment to train on risky operations that can be difficult to get sufficient practice on in real life (Chernikova et al., 2020). The simulated environment allows the instructor1 to manipulate events and use recorded data (audio, video, or action logs) for educational purposes. Simulations can also be flexible, efficient, and cost-effective, as they can have low transition costs when accessing and changing between different scenarios, as opposed to more rigid environments for on-the-job training (Kim et al., 2021).2
In this article, we compare simulation-based training in three fields, asking how instructors in bachelor programs for professional education use and evaluate the role of technology in simulation-based teaching methods. To answer this question, we conducted a survey targeting instructors in bachelor-level educational programs in nautical studies, biomedical laboratory science (BLS), and nursing studies in Norway, Sweden, and Denmark. Empirical insights into how instructors in different fields use and evaluate these technology-intensive and often costly pedagogical practices can inform further qualitative analysis, theoretical development, and cross-disciplinary learning. Our comparison focuses on practices between professions, not on national differences within them.
Below we provide a brief introduction to simulation-based teaching methods in the three different professional education programs. Materials and methods are described, before we present data on instructors’ choice of technology, their frequency of use, opinions about what makes a good simulation exercise, and their assessments of current use and available technologies. The findings show distinct practices and important similarities, which are highlighted in the subsequent discussion. In conclusion, we suggest paths for further exploring our findings.

Simulation-Based Teaching Methods Across Three Professional Fields

The growth of simulation methods in teaching tracks the development of pedagogical principles known as “active learning”, an approach that focuses on how students learn by engaging in learning activities (Bonwell & Eison, 1991; Freeman et al., 2014), where a main goal is to minimize the “theory-practice gap” (Greenway et al., 2019). The academic fields included in this study all exemplify “protective professionalism”, requiring formal certification for practicing as a professional (Noordegraaf, 2020). In these fields, practitioners must integrate a substantial corpus of theoretical knowledge along with a range of embodied skills (the body knows how to act), whether the candidate works on a ship’s bridge, in a biomedical laboratory, or in a clinic. Accordingly, these educational programs aim to produce candidates capable of delivering “esoteric services” (Hughes, 1963): professional practices that are both manual and symbolic, in high-risk and complex domains. Historically, these three bachelor-level programs have also relied on a substantial amount of practical, on-the-job vocational training to support the students’ transitions from novices toward competent professionals (Issenberg & Scalese, 2008; Manuel, 2017). A recent orientation toward active learning methods in healthcare and maritime education can be understood in this context.
Nautical studies enable successful candidates to pursue a career as a deck officer. These programs can vary in set-up and length but must comply with international standards and specifications (IMO, 2011), where “approved relevant simulation training” is listed as a method for demonstrating competence in specific skills. In some cases, simulators may substitute for onboard training. Despite these international requirements, there are substantial differences between countries and institutions in the availability of simulation equipment and its frequency of use (Jensen & Resnes, 2017). In Sweden, in contrast to Norway and Denmark, the requirement for practical sea experience to obtain certificates is integrated as a fourth year within a bachelor’s degree.
BLS trains students for laboratory analyses, usually within a few specialized domains. Laboratory training courses have mainly focused on the technical aspects of practical skills to ensure the adequate quality and reproducibility of laboratory analyses. When implemented in a BLS program, the burgeoning field of simulation-based training can help students develop their practical skills in a realistic working environment, beginning with basic technical skills such as blood sampling, microscopy, and pipetting (Björn et al., 2021; Obrien, 2013). To address the knowledge gaps of future professionals in BLS, simulation-based training can also be tailored to target specific learning goals, offering a promising development in higher education (Moro & Gregory, 2019; Obrien, 2013).
Nursing studies have long made widespread use of procedural imitation and anatomical models (Rosen, 2013). In bachelor-level programs, simulation-based training helps to prepare students for practical studies. Nursing simulation is driven by an increased demand for patient safety, more advanced medical treatments, and increased complexity in the healthcare system (Aebersold, 2016; Gaba, 2004). Although the efficacy of simulation-based training in healthcare is well documented (La Cerra et al., 2019; McGaghie et al., 2011), more knowledge is required about variations in the adoption of technology in simulation-based nursing training. There are, for instance, notable discrepancies between ministerial orders in the Nordic countries and EU regulations about whether simulation can replace clinical hours “in direct contact with a healthy or sick individual” (Henriksen et al., 2020).

Materials and Method

Our empirical data stems from a survey targeting instructors teaching simulation-based methods across three bachelor-level studies in Norway, Sweden, and Denmark.
The survey questionnaire was organized into four sections. The first focused on the instructor’s background. The following two focused on design choices in different phases of the simulation activity, from planning to execution and follow-up with students. Here, we identified ten simulation modalities and asked respondents to report which they used and how often (seven categories ranging from daily to do not use, here collapsed into four categories for readability). The final section asked instructors to evaluate different claims about the use of simulation as a teaching method and factors influencing instructors’ choices regarding such methods. We asked respondents to answer these questions on a Likert scale from 1 to 5 (the most negative sentiment, e.g., “to a very small extent”, was given the value 1).
Initially, the survey was designed in Norwegian by a cross-disciplinary group of researchers and practitioners with knowledge about simulation to ensure valid terminology and questions that worked across the fields. Next, the instrument was piloted by an instructor from each of the respective fields. The ten forms of simulation technology identified in the survey were defined through group discussions, and the instructors piloting the survey were also asked whether the listed options were adequate. Identifying categories that were meaningful across professions was a challenging part of the design. The final survey was translated into Danish, Swedish, and English, and the translations were quality-checked by native speakers with relevant thematic competence. In the analysis, the four datasets (one per language) were merged and exported to SPSS. In addition to various descriptive analyses of the dataset, we performed Kruskal-Wallis tests (KW tests), comparing respondents’ attitudes toward technology use (dependent variable) across their professional fields (independent grouping variable). The null hypothesis was that there would be no group-level differences in responses to the specific variables being tested across respondents’ professional backgrounds. The significance level for rejecting the null hypothesis was set at 0.05. A post hoc, pairwise comparison of professional fields explored how responses differed in cases where the null hypothesis was rejected in favor of the alternative (at least one group being significantly different from the others regarding the dependent variable).
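The analyses were run in SPSS; purely as an illustration of the logic described above, the following minimal Python sketch runs a Kruskal-Wallis test across the three fields and, if significant, a pairwise post hoc step. The file name, column names, and the choice of Bonferroni-adjusted Mann-Whitney U tests for the post hoc comparison are assumptions for this sketch, not the paper’s exact procedure (SPSS typically offers Dunn’s test here).

```python
# Sketch only: Kruskal-Wallis across professional fields plus a pairwise post hoc step.
# "survey_responses.csv", "field", and "attitude_score" are hypothetical names.
from itertools import combinations

import pandas as pd
from scipy import stats

df = pd.read_csv("survey_responses.csv")  # one row per respondent, Likert scores 1-5

# One list of scores per professional field (nautical, BLS, nursing)
groups = [g["attitude_score"].dropna() for _, g in df.groupby("field")]

h_stat, p_value = stats.kruskal(*groups)
print(f"Kruskal-Wallis H = {h_stat:.2f}, p = {p_value:.3f}")

if p_value < 0.05:  # null hypothesis rejected at the 0.05 level
    pairs = list(combinations(df["field"].unique(), 2))
    for a, b in pairs:
        u, p = stats.mannwhitneyu(
            df.loc[df["field"] == a, "attitude_score"].dropna(),
            df.loc[df["field"] == b, "attitude_score"].dropna(),
        )
        p_adj = min(p * len(pairs), 1.0)  # Bonferroni adjustment
        print(f"{a} vs {b}: unadjusted p = {p:.3f}, adjusted p = {p_adj:.3f}")
```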

Sampling and Distribution

Nettskjema was used to design and distribute the survey.3 Overall, 181 responses were received. Due to substantial differences in size among the three professional fields, the team adopted separate strategies to identify the population.
Eight institutions offer bachelor-level education in nautical studies in Norway (4), Sweden (2), and Denmark (2). We asked the study program leaders to provide a list of relevant instructors involved in simulation-based training at their institution. We were able to compile a list of all relevant instructors and send the survey directly to each, with the entire population being 59 individuals (27 from Norway, nine from Denmark, and 23 from Sweden).
It was also possible to compile a list identifying all potential respondents within BLS. A total of 21 institutions offer BLS education across Norway (7), Sweden (8), and Denmark (6), although some institutions have more than one campus. The population of instructors in this field was 34 (13 from Norway, nine from Denmark, and 12 from Sweden).
In nursing studies, we identified 66 institutions (80 campuses) that offer a full- or part-time bachelor’s degree across Norway, Sweden, and Denmark. Given the timeframe and available resources, we chose a different strategy to identify respondents in nursing. We approached those with a role equivalent to study program directors with information about the survey, asking them to forward the survey to all their simulation instructors. Thus, we did not define the population of possible instructors, but rather possible institutions.
Table 1. Distribution of respondents by professional field, including country distribution for each field, N = 179 (2 missing: one not listing profession and one nursing instructor not listing country).

Country | Nautical studies | Biomedical laboratory science | Nursing studies | Respondents per country (%)
Norway  | 23 | 7  | 33  | 35.2%
Denmark | 5  | 1  | 47  | 29.6%
Sweden  | 15 | 4  | 44  | 35.2%
Total   | 43 | 12 | 124 | 100%
Table 1 displays the distribution of respondents in each field and country. Of the respondents, 64 (36%) taught in Norway, 53 (30%) in Denmark, and 63 (35%) in Sweden. The number of respondents per field varied according to the potential population. Forty-three respondents (24%) came from nautical studies (population 59, 79.2% response rate). All nautical institutions in the population are represented. A total of 125 respondents (69%) were from nursing studies. Given the recruitment strategy for this field, it was not possible to calculate the response rate. However, the survey had a broad geographical distribution, with respondents from 30 different nursing departments (Sweden, 14; Norway, 10; Denmark, 16). Twelve respondents (7%) represented BLS, yielding a 35% response rate from that field (with a population of 34). Since our data suggest that BLS respondents are unfamiliar with thinking about their practical training as simulations per se, we include this sample, despite the low response rate, as observations that deserve further scrutiny. Note that the small BLS sample means that percentages, rather than absolute values, can be misleading, a point worth keeping in mind as we present our results.
While 70% of our respondents are women, only 7% of the respondents from nautical studies are women. This reflects a general gender imbalance across these three professional fields (Wold et al., 2022). In addition to formal education, 77% of our respondents have supplementary training in using simulation as a teaching method, either credit-eligible courses (39%) or other, non-certifying courses (38%). BLS is distinct here: only two of 12 instructors have any training in simulation-based teaching methods.

Results

Frequency and Modalities of Simulation-Based Teaching in Bachelor-Level Education

A central aim of our survey was to better understand the use of technology among instructors in simulation-based education at the bachelor level. Figures 1, 2, and 3 present the respondents’ reported use of each modality, differentiated by the following frequency of use: weekly or more often (light green), monthly (green), or once per semester or less often (blue). The white dotted bar represents do not have access to/do not use. For clarity, we present the results for each professional field in a separate figure.4
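For readers who wish to reproduce this kind of overview from comparable survey data, the sketch below (not the authors’ plotting code) collapses the original seven frequency categories into the four reported here and draws one stacked bar per modality, as in Figures 1–3. The file name, column layout, category labels, and recode map are illustrative assumptions.

```python
# Sketch only: collapse seven survey frequency categories into four and plot
# stacked horizontal bars (one bar per simulation modality), as in Figures 1-3.
import pandas as pd
import matplotlib.pyplot as plt

RECODE = {  # hypothetical seven survey categories -> four display categories
    "daily": "weekly or more often",
    "weekly": "weekly or more often",
    "monthly": "monthly",
    "a few times per semester": "once per semester or less often",
    "once per semester": "once per semester or less often",
    "less than once per semester": "once per semester or less often",
    "do not use": "do not have access to / do not use",
}
ORDER = ["weekly or more often", "monthly",
         "once per semester or less often", "do not have access to / do not use"]

# Hypothetical export: one row per respondent, one column per modality
df = pd.read_csv("responses_nautical.csv")

shares = (
    df.apply(lambda col: col.map(RECODE).value_counts(normalize=True) * 100)
      .reindex(ORDER)
      .fillna(0)
      .T  # rows = modalities, columns = the four frequency categories
)
shares.plot(kind="barh", stacked=True, figsize=(8, 5))
plt.xlabel("Share of instructors (%)")
plt.legend(title="Frequency of use", bbox_to_anchor=(1.02, 1), loc="upper left")
plt.tight_layout()
plt.show()
```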
Figure 1. Self-reports from instructors in nautical studies on how often they use various simulation modalities (n = 41–43), relative values (%).
Figure 1 shows that nautical studies are dominated by advanced simulator training: 72% of our sample reported using advanced simulators monthly or more often, with 60% using these technologies on a weekly or daily basis. Furthermore, close to half of the nautical instructors use laboratories or areas for skills training (47%), computer-based simulations (49%), and simulations with objects/instruments (44%) monthly or more often.
Other modalities are less frequently used by instructors in nautical studies. Simulations using live markers and role-playing games without technical augmentation are used by 40% once per semester or more often. Simulations using augmented reality (AR) or virtual reality (VR) are rarely used, with only 5% and 7% using AR or VR once per semester or less often, respectively.
Figure 2. Self-reports from instructors in BLS studies on how often they use various simulation modalities (n = 12), relative values (%)
BLS instructors (Figure 2) primarily use laboratories or designated areas for skills training, as well as simulations with instruments and objects. All respondents used laboratories and areas for skills training monthly or more often, with 58% using them weekly or more. Likewise, simulations using objects and instruments were used monthly or more by 88%, and weekly by half of the respondents. Other simulation modalities frequently used by BLS instructors include pen-and-paper simulations (58%); in addition, cloud-based or computer-based simulations are used by over 40% on a monthly or more regular basis.5
Other forms of simulation are also used, but less frequently or by a smaller group of instructors. This includes simulations using advanced simulators, which 16% use at least once per semester, and simulations using live markers, which 25% use once per semester or more frequently. VR and AR simulations are reportedly not used by this sample of BLS instructors.
Figure 3. Self-reports from instructors in nursing studies on how often they use various simulation modalities (n = 118–123), relative values (%)
The dominant simulation modalities used monthly or more frequently in nursing studies (Figure 3) are simulations using objects/instruments (65%), laboratories/areas for skills training (57%), and advanced simulators (47%). Weekly or more frequent use was 13%, 11%, and 7%, respectively, for these same modalities. If the alternative once per semester or less is included, these modalities are used by over 80% of our respondents (93% for simulations using objects/instruments).
Other simulation modalities are also widely used, but less frequently: 34% draw on role-playing games without technical augmentation monthly or more often (81% when including less frequent use). Likewise, actors or live markers are used by 22% monthly or more often, rising to 67% when once per semester or less is included. The least-used options by nursing instructors were cloud-based software (28%), AR (5%), and VR (9%).

Instructors’ Evaluation of Available Simulation Technology

The survey included questions aimed at evaluating whether the instructors were content with the technology they currently had available to do simulations. When asked whether “the simulation technology makes it possible for us to realize the learning activities that we want in simulation”, instructors across the three fields are content overall (mean score of 3.74/SD 1.104). Moreover, when asked to evaluate the claim “I often find myself in a situation where I want a technology other than the one available to be able to reach defined learning outcomes” (1 = to a very small extent), the mean score was 2.16 (SD 1.096). This indicates they are overall satisfied with their options. Instructors from nautical studies appear more content with the available technology than instructors from nursing studies and BLS instructors (with the latter being the least positive in their assessments).
Besides asking how much time different instructors spend on simulation-based methods, we also asked for their assessment of whether they spend too much time on simulation.
Figure 4. Assessments of the statement “We spend too much time on simulation in teaching” (N = 178). Standard deviations: nautical studies, 1.25; BLS studies, 0.85; nursing studies, 0.46. 1 = strongly disagree.
Figure 4 shows that most respondents disagreed with this statement. Both nautical and nursing instructors strongly disagree (median score of 1), while BLS instructors have a median of 2.

Factors That Influence the Execution and Organization of Simulation Exercises

What Contributes to Well-Executed Simulation Exercises?

We asked instructors to consider the role of three factors in a well-executed simulation exercise: technical realism (that the technical environment resembles the environment that students will encounter in working life), social realism (the task performance environment resembles the environments that students will encounter in working life), and use of anecdotes and stories (linking simulations to work-life situations to create a meaningful context for students).6
Figure 5. The instructors’ assessments of three factors for a well-executed simulation exercise (N = 179–180). Standard deviations: technical realism, 0.974; social realism, 0.989; active use of stories, 0.984. 1 = to a very small extent.
Figure 5 shows that our respondents assessed these factors to be roughly equally important for well-executed simulation exercises, with a mean difference of 0.12 between the active use of anecdotes/stories (highest mean, 3.94) and social realism (lowest mean, 3.82). Standard deviations were similar (from 0.974 to 0.989). A Kruskal-Wallis test checking for differences in answers depending on professional field retained the null hypothesis for technical realism as well as for the use of stories (indicating no significant group-level differences in how these are valued). Social realism yields a significant difference (p = 0.033). A post hoc, pairwise comparison yields significant results for BLS compared to the other two fields (nautical studies, p = 0.026/0.77; nursing studies, p = 0.009/0.027). The median score for BLS was 3, compared to a median score of 4 in the two other fields.

Factors That Influence the Instructor’s Ability to Optimize Their Use of Simulation-Based Teaching

While questions in the previous section revolved around factors that instructors consider important regarding realism, the next section asked instructors to evaluate the influence of five different factors that can facilitate or hamper their ability to optimize simulation-based teaching methods. This included time, staff resources, and material conditions (such as access to rooms and equipment), as well as their own or colleagues’ competencies on simulation as a form of teaching.
Figure 6. Respondents’ evaluation of how the identified factors influence their ability to optimize simulation-based teaching (N = 179–181). 1 = very negatively.
According to Figure 6, none of these five factors appear to negatively influence instructors’ perceived opportunities to optimize simulation-based teaching. The respondents consider factors pertaining to their own and their colleagues’ competencies to affect their opportunities more positively than the other three factors. A Kruskal-Wallis test examining the association between respondents’ answers and their professional field gives reason to reject the null hypothesis for all statements except the factor “material conditions”. Respondents’ professional backgrounds thus influence how they rate the factors that can affect opportunities for optimizing the use of simulation-based teaching methods.
Both nautical and nursing studies had a median score of 4 for the factor available time, while the median for BLS instructors was 3.5. The highest median score for evaluation of teacher resources is provided by respondents from nursing studies (5), followed by nautical studies (4), and BLS (3). A pairwise analysis of time and resources yields significant results.7 Figure 7 displays our respondents’ assessments of how their own and colleagues’ competence influences their ability to optimize teaching activity.
Figure 7. A) Boxplot of how respondents from each field evaluate the influence of their own competence on the optimization of simulation training, on a scale from one to five (N = 180). B) Boxplot of how respondents consider the influence of their colleagues’ competence.
Figure 7A and B show the distribution of answers for each professional field. In the boxplot, the blue bar indicates the interquartile spread for scores within each group, whereas the thick black bar represents the median value. Nursing study instructors gave a higher median score (5) for both factors compared to the two other fields. Nautical studies have a median of 4 for both statements, while BLS scores a median of 3.5 with regard to their own competence and a median of 4 for their colleagues’ competence. BLS also had a larger interquartile spread.
Conducting a pairwise comparison between the groups for Figure 7A revealed a significant result for BLS compared to both nautical studies (p = 0.015/0.44) and nursing studies (p = 0.001/0.001). The statement in Figure 7B also yields a significant result for BLS compared with nautical studies (p = 0.012/0.035) and nursing studies (p = 0.002/0.007).
Figures 6, 7A, and 7B suggest that our respondents have a positive evaluation of their competence in simulation as a form of teaching. How this relates to competence in using simulation technologies was further investigated by the statement “We have sufficient expertise to utilize the simulator technology”. For all respondents, this statement had a mean score of 3.3 (SD 1.33). A KW test checking for group-level differences in responses based on professional field gave significant results (p = 0.001).
Figure 8. Boxplot for statement “We have sufficient expertise to utilize the simulator technology” (N = 175).
Figure 8 displays the scores for the above statement, split by professional field. The median score is higher for nautical instructors (4) than for BLS and nursing (3). Notably, nursing instructors had a larger interquartile spread than the other two fields. A pairwise comparison between the three fields yielded a significant result when comparing nursing and nautical studies (p = 0.001/0.000), and BLS and nautical studies (p = 0.038/0.113).

Discussion

In this study we are concerned with two aspects of simulation-based training: as a way of doing instruction (technique), and as a set of tools to accomplish the job (technologies). Gaba (2004) underscored this important distinction in order to avoid reducing simulation-based training to mere technology. Regarding simulation as a pedagogical technique, our survey shows that respondents draw on a myriad of technologies to facilitate simulation-based training for students. There are both interesting similarities and salient differences between the three fields.
A key concern relates to the simulation modalities and their use frequencies. The results suggest noticeable contrasts between the three fields, raising questions about potential differences in pedagogical assumptions across these professional education programs. Our results show a clear difference in frequency of use: nautical studies and BLS instructors take their students into simulation exercises more often than nursing study instructors do.
The two fields that most frequently adopt simulation-based training rely on different technological modalities. Nautical study instructors most often use advanced simulators, while BLS instructors rely on laboratories and areas of skill training, in addition to objects or instruments for skills training. More in-depth studies are needed to explore what this difference in modality signifies. Our study shows that BLS instructors give less weight to the dimension of social realism in simulation training than the two other fields. A question for future research is how choices about technological modality relate to pedagogical ideas regarding the role of skills training or more complex scenarios, respectively.
Nursing studies draw on all the listed modalities, but instructors use simulations less frequently than in nautical studies and BLS. An obvious factor that may account for this difference is the larger cohort of students in nursing compared to nautical studies and BLS. Consequently, simulation-based training in nursing studies may require more resources and time to conduct. Still, when our respondents are asked to evaluate how various factors like time, teacher resources, and material conditions affect their abilities to optimize teaching, they are quite positive across the board (3.82–3.96). Moreover, the median score for nursing instructors was higher (5) than for the other two fields. This is somewhat surprising given the observed differences in use-frequency (and considering differences in class size). Use patterns in nursing education might therefore involve a bootstrapping type of situation, where instructors consider the current level of simulation-based training feasible given the size of student groups and available resources. Further analysis of the practical implications of these differences is needed in order to better understand the underlying pedagogical assumptions and drivers. Another possible explanation for differences in use-frequency is that nursing studies have considerably more work practice in the curriculum, compared to the other two fields. According to this logic, we should see differences between countries depending on whether they integrate the practice period into the bachelor’s degree. Our data do not support this inference, and further studies that specifically look at the role of integrated practice are encouraged.
Another notable result is the broad use of different forms of simulation across the three professional fields. Of the ten simulation modalities listed in our survey, eight were used in all three education programs. An intriguing exception was AR and VR, which had no reported users in BLS and were rarely used by instructors from nautical and nursing studies. Instructors do not seem to experience this as a problem, as indicated by their responses to the question of whether they are satisfied with current technological options. Alternatively, instructors may not have sufficient knowledge about VR and AR as simulation technologies to know what they are missing out on. These results show that instructors from all three fields generally appear to be content with the pedagogical opportunities afforded by their simulation technology and the available options for technology.
In general, instructors do not agree with the statement that they spend too much time on simulation and appear to value it as a teaching method. Here, instructors from both nursing and nautical studies strongly disagree (median score 1), which suggests that they would like to spend more time simulating. Our results also suggest that instructors from all three professions consider successful simulation exercises to substantially reproduce both the technical and interactional aspects of the real-world environments where students will conduct their work. Particularly interesting is the degree to which instructors emphasize the significance of stories and anecdotes in creating a meaningful experiential context for their students.
Our sample of instructors appears confident in their ability to conduct simulation-based teaching. The survey included three different statements asking them to evaluate their competence in simulation-based methods: two asking instructors to evaluate their own and colleagues’ competence in simulation as a teaching method, and one asking about their expertise in utilizing the simulation technology. These statements are interesting to examine together. Among nautical instructors, all three statements received equal median scores (4). BLS instructors have the most confidence in their colleagues’ competence (4), followed by their own competence in the teaching methods (3.5), and the least confidence in their knowledge of the technology (3). Most notable are the scores among the nursing study instructors. Nursing instructors report high confidence in their own and colleagues’ competencies regarding the teaching method (5), with remarkably lower confidence in their own expertise with the simulation technology (3). These results suggest that high confidence in competence for simulation-based teaching is not primarily due to confidence in mastery of the simulation technology, but to other factors. Further studies are needed to explore the observed difference between nautical and nursing instructors. A place to start might be the observed differences in use-frequency and gender composition in the two fields.
Comparing three distinct professional fields in this manner introduces both strengths and weaknesses to the present study. One limitation pertains to the population under study, as the available time and resources necessitated distinct recruitment strategies across the fields. Although control over the entire nursing population was unattainable, clear selection criteria were employed to address this limitation. Also, surveying across three professional fields with different terminologies led to compromises in question design. Moreover, our comparative angle required a pragmatic stance, leading us to consolidate important nuances, like those between skills-based and scenario-based training forms, into a single category of simulation. This choice originates from an acknowledgment that in-depth qualitative inquiry is needed into how the various disciplines define what these concepts entail, as well as how they use the various modalities for these forms of training. However, these cross-disciplinary compromises are also a strength, as they enabled a unique comparative perspective and an empirical base for further empirical and theoretical study.

Conclusion

This article provides an empirical state of the art on the use and evaluation of technology-facilitated simulation-based teaching methods among Scandinavian instructors in three professional fields. Our findings reveal a generally positive sentiment toward simulation technology, as most instructors are largely satisfied with their existing options and their ability to employ them effectively. However, the three fields differ in terms of use-frequency and choice of technological modality. While this study incorporates relevant variables that may partially account for these discrepancies, more research is needed to satisfactorily explain why these differences exist. For instance, how do these observed differences relate to epistemic views about methods, professional practice, and technology as mediating tools for training? Exploring these findings further, in dialogue with both in-depth qualitative studies of practice and the simulation literature, might further elaborate on how synthetic learning environments are operationalized in different professional environments. This endeavor has the potential not only to shed light on similarities and distinct approaches but also to stimulate cross-disciplinary debate about the proper role of technology in simulation-based training.

Acknowledgments

The research is funded by the Norwegian Research Council #316212. Thanks to all participants as well as to our wider project team for productive feedback on the design and interpretation of the results.

Footnotes

1
We have chosen to use the term ‘instructor’ in this paper but recognize that the three fields use different terms for the supervising teacher/facilitator in the simulation, each having different pedagogical connotations.
2
Of course, there are reasons to nuance this positive outlook, which is also an important issue in theoretical debates about simulations, technology, and education.
4
With respect to differences between countries within professions, only two variables yielded significant results in the correlational analysis: simulations using actors/live markers and simulations with the help of laboratories/areas of skill training produced significant results in both nautical studies and nursing. For nautical studies, Norwegian instructors appear to use these forms of simulation more frequently than their Swedish and Danish colleagues. For nursing, the results indicate that Swedish instructors use these two modalities less frequently than the others.
5
Two approaches are common for digital simulations in BLS. One involves the instrument operating system necessary to perform most analytical simulations. This OS usually runs on a regular computer that is integrated with the instrument. The other approach involves serious games using cloud-based software for training analytical techniques and test interpretation. BLS instructors likely distinguished between these two approaches when answering the questionnaire.
6
The research team has drawn on both practitioners’ experiences and the theoretical debate about simulation-based training to define the categories.
7
For time the results are p = 0.031/0.093 for BLS vs. nursing studies. For teacher resources the results are p = 0.043/0.130 for nautical studies vs. nursing studies, and p = 0.030/0.091 for BLS vs. nursing studies.

References

Aebersold, M. (2016). The history of simulation and its impact on the future. AACN advanced critical care, 27(1), 56–61. https://doi.org/10.4037/aacnacc2016436
Björn, M. H., Ravyse, W., Botha-Ravyse, C., Laurila, J. M., & Keinonen, T. (2021). A Revised Pedagogy Model for Simulator-Based Training with Biomedical Laboratory Science Students. Education Sciences, 11(7), 328. https://doi.org/10.3390/educsci11070328
Bonwell, C. C., & Eison, J. A. (1991). Active learning: Creating excitement in the classroom. 1991 ASHE-ERIC higher education reports. ERIC.
Cannon-Bowers, J., & Bowers, C. (2009). Synthetic learning environments: On developing a science of simulation, games, and virtual worlds for training. In Learning, training, and development in organizations (pp. 250–282). Routledge.
Chernikova, O., Heitzmann, N., Stadler, M., Holzberger, D., Seidel, T., & Fischer, F. (2020). Simulation-Based Learning in Higher Education: A Meta-Analysis. Review of Educational Research, 90, 499–541. https://doi.org/10.3102/0034654320933544
Freeman, S., Eddy, S. L., McDonough, M., Smith, M. K., Okoroafor, N., Jordt, H., & Wenderoth, M. P. (2014). Active learning increases student performance in science, engineering, and mathematics. Proceedings of the national academy of sciences, 111(23), 8410–8415. https://doi.org/10.1073/pnas.1319030111
Gaba, D. M. (2004). The future vision of simulation in health care. BMJ Quality & Safety, 13(suppl 1), i2–i10. https://doi.org/10.1136/qhc.13.suppl_1.i2
Greenway, K., Butt, G., & Walthall, H. (2019). What is a theory-practice gap? An exploration of the concept. Nurse education in practice, 34, 1–6. https://doi.org/10.1016/j.nepr.2018.10.005
Henriksen, J., Löfmark, A., Wallinvirta, E., Gunnarsdóttir, Þ. J., & Slettebø, Å. (2020). European Union directives and clinical practice in nursing education in the Nordic countries. Nordic journal of nursing research, 40(1), 3–5. https://doi.org/10.1177/2057158519857045
Hughes, E. C. (1963). Professions. Daedalus, 655–668.
Issenberg, S. B., & Scalese, R. J. (2008). Simulation in health care education. Perspectives in biology and medicine, 51(1), 31–46. https://doi.org/10.1353/pbm.2008.0004
Jensen, T. E., & Resnes, T. (2017). Comparing nautical BSc programs by quality indicators. 19th Annual General Assembly. International association of maritime universities proceedings «Time for action: a new thrust for the future of MET and research». Barcelona School of Nautical Studies of the Universitat Politécnica de.
Kim, T.-e., Sharma, A., Bustgaard, M., Gyldensten, W. C., Nymoen, O. K., Tusher, H. M., & Nazir, S. (2021). The continuum of simulator-based maritime training and education. WMU Journal of Maritime Affairs, 20(2), 135–150. https://doi.org/10.1007/s13437-021-00242-2
La Cerra, C., Dante, A., Caponnetto, V., Franconi, I., Gaxhja, E., Petrucci, C., Alfes, C. M., & Lancia, L. (2019). Effects of high-fidelity simulation based on life-threatening clinical condition scenarios on learning outcomes of undergraduate and postgraduate nursing students: a systematic review and meta-analysis. BMJ open, 9(2), e025306. https://doi.org/10.1136/bmjopen-2018-025306
Manuel, M. E. (2017). Vocational and academic approaches to maritime education and training (MET): Trends, challenges and opportunities. WMU Journal of Maritime Affairs, 16, 473–483. https://doi.org/10.1007/s13437-017-0130-3
McGaghie, W. C., Issenberg, S. B., Cohen, E. R., Barsuk, J. H., & Wayne, D. B. (2011). Does simulation-based medical education with deliberate practice yield better results than traditional clinical education? A meta-analytic comparative review of the evidence. Academic medicine, 86(6), 706–711. https://doi.org/10.1097/ACM.0b013e318217e119
Moro, C., & Gregory, S. (2019). Utilising anatomical and physiological visualisations to enhance the face-to-face student learning experience in biomedical sciences and medicine. Biomedical Visualisation: Volume 3, 41–48. https://doi.org/10.1007/978-3-030-19385-0_3
Noordegraaf, M. (2020). Protective or connective professionalism? How connected professionals can (still) act as autonomous and authoritative experts. Journal of professions and organization, 7(2), 205–223. https://doi.org/10.1093/jpo/joaa011
Obrien, M. (2013). Bridging The Gap Between Theory And Practice: Pedagogical Innovations In The Biomedical Science Classroom That Engage Students In Active Learning And Enhance Both Teaching And Learning. INTED2013 Proceedings,
Rosen, K. (2013). The history of simulation. The comprehensive textbook of healthcare simulation, 5–49. https://doi.org/10.1007/978-1-4614-5993-4
Wold, L. K., Akin, D., Bern, A., Lauritzen, T., Tholstrup, L. M., & Alemayehu, F. K. (2022). Likestilling og mangfold i maritim næring og utdanning – en kartlegging [Equality and diversity in the maritime industry and education: A survey].

Information & Authors

Information

Published In

Volume 47, Number 1, 12 March 2024
Pages: 317

History

Published online: 12 March 2024
Issue date: 12 March 2024

Authors

Affiliations

Marte Fanneløb Giskeødegård [email protected]
Associate Professor
Department of Ocean Operations and Civil Engineering, Norwegian University of Science and Technology (NTNU)
Associate Professor
Department of Health Sciences in Ålesund, NTNU
Ann-Kristin Tveten [email protected]
Associate Professor
Department of Biological Sciences, NTNU
Mads Solberg [email protected]
Associate Professor
Department of Health Sciences in Ålesund

Declaration of interest

The authors report there are no competing interests to declare.
