
Evaluating VERT as a radiotherapy plan evaluation tool: comparison with treatment planning software

Published online by Cambridge University Press:  25 October 2019

P. Bridge*, M. C. Kirby and J. A. Callender
Affiliation: School of Health Sciences, University of Liverpool, Liverpool L69 3GB, UK
*Author for correspondence: Dr Pete Bridge, University of Liverpool, Brownlow Hill, Liverpool L69 3GB, UK. Tel: +44(0)1517958366. E-mail: [email protected]

Abstract

Introduction:

The virtual environment for radiotherapy training (VERT) helps students to gain technical skills and understanding of 3D anatomy and dosimetry. It has potential as a tool for treatment plan evaluation, although little formal evidence currently supports this.

Aim:

This paper reports findings from a plan evaluation workshop that facilitated comparison of VERT plan evaluation tools with those provided by conventional treatment planning software (TPS).

Method:

Students on a pre-registration Post-Graduate Diploma in Radiotherapy worked in small groups evaluating lung plans using both VERT and Eclipse TPS tools. All students were invited to provide ratings concerning how helpful each modality was for a range of evaluation parameters and preferences for use.

Results:

Most students (11 out of 14) found the session useful and expressed a desire to use VERT in future plan evaluation. The TPS was perceived to be more helpful with constraint-based evaluation while VERT was more helpful with evaluating plans for clinical set-up and delivery (p < 0·001).

Conclusion:

Student therapeutic radiographers found VERT to be helpful as a plan evaluation tool alongside standard TPS tools, in particular for the clinical set-up and delivery aspects of planning. Further work is ongoing to identify the specific impact of VERT as a plan evaluation tool for both students and qualified planners.

Type: Original Article

This is an Open Access article, distributed under the terms of the Creative Commons Attribution licence (http://creativecommons.org/licenses/by/4.0/), which permits unrestricted re-use, distribution, and reproduction in any medium, provided the original work is properly cited.

© The Author(s) 2019. Published by Cambridge University Press

Introduction

Practical radiotherapy planning experience is an integral aspect of pre-registration training. The knowledge and skills needed to produce a clinically acceptable plan are vital preparation for both clinical treatment planning and delivery, especially for more complex, dynamic and adaptive techniques. From an educational perspective, treatment planning also offers a useful format for integrating student understanding of anatomy, oncology, technique and radiobiology, as well as instilling holistic patient-focused practice, despite planning and treatment sometimes being viewed in clinical practice as separate entities. Aside from practical skills assessment, students are frequently assessed on their ability to evaluate a radiotherapy plan against accepted dose targets and constraints. For this they can utilise the tools provided by conventional treatment planning software (TPS), including dose–volume histograms (DVH), the conformity index [1] and automated planning metric reports [2].
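For reference, the conformity index mentioned above is commonly expressed (in the RTOG form reviewed by Feuvret et al. [1]) as the ratio of the volume enclosed by the reference isodose to the target volume:

    \mathrm{CI}_{\mathrm{RTOG}} = \frac{V_{\mathrm{RI}}}{\mathrm{TV}}

where V_RI is the reference isodose volume and TV is the target volume; a value close to 1 indicates good conformity, whereas values above or below 1 suggest irradiation of excess normal tissue or target under-coverage, respectively.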

Since its introduction to radiotherapy education in 2007 [3], the virtual environment for radiotherapy training (VERT) 3D visualisation platform has become an increasingly useful teaching resource for therapeutic radiography students [4–10], medical physics students [11, 12], radiotherapy staff [13] and radiotherapy patients [14]. This simulation software is used around the world [15, 16] to enable students to gain technical skills in a safe environment and to visualise patient anatomy, contoured volumes and dose distributions in large-screen 3D [4]. The use of VERT as an aid to teaching treatment plan comparison and evaluation was discussed in a recent publication [5], but no data relating to this have yet been published. This project aimed to evaluate the potential role of VERT in a radiotherapy plan evaluation workshop through a comparison with the conventional tools provided by a leading TPS.

Methods

All 24 students on a pre-registration Post-Graduate Diploma in Radiotherapy course attended a 3-hour workshop that provided them with plan evaluation experience. The aim of the session was to provide them with useful feedback that they could utilise in their summative assessments. The workshop presented students with three radical lung plans for the same patient dataset as their assessment. The plans comprised a conventional conformal plan, a static gantry intensity-modulated radiotherapy plan and a volumetric modulated arc therapy plan for comparison. Students were split into groups of five, with an experienced tutor on hand for individual and group guidance. They were asked to use both the Eclipse TPS (Varian Medical Systems, Palo Alto, CA, USA) and VERT version 3.2 (Vertual Ltd, Hull, England) to help with their plan evaluation and comparison. Each evaluation session took between 45 minutes and 1 hour, with the order of the two evaluation tools randomised. All students had previously undertaken at least 20 hours of tutor-guided practical planning with Eclipse within two module assignments, but had little experience of using VERT other than in treatment set-up simulation. Guidance was therefore provided via both tutor demonstration and written information on the plan visualisation functions within VERT. These included interactive 3D visualisation of different machines, plans, dose distributions, contours and surfaces. Students were guided to display dose on orthogonal CT surfaces and to experiment with different transparencies, dose levels and colour maps. They were also shown how to benefit from different viewpoints using the 3D navigation, pan, zoom and rotation functions. Beam visualisation and animation were demonstrated to help students visualise delivery. To facilitate independent learning, students were encouraged to experiment with the software, with tutors on hand to provide assistance if required.

After the session, all students were invited to provide feedback on their experience via an anonymous online survey (SurveyMonkey™). Rating questions used a 0–9 scale to gather data concerning how helpful each modality was for a range of evaluation tasks and objectives, as shown in Table 1. Additional Likert-style questions sought feedback on preferences for use of each modality, as shown in Table 2. Finally, open questions encouraged further description of the perceived value of the technology and of its use in radiotherapy planning education.

Table 1. Relative usefulness of evaluation modalities

Bold values represent the majority response.

Table 2. Preferred formats of evaluation modalities

Bold values represent the majority response.

Rating responses were subjected to inferential statistical analysis with paired t-tests comparing whole cohort ratings of each modality. Independent t-tests also compared perceptions between groups using each modality in different orders. Descriptive statistics were used to summarise the Likert responses. Responses to open questions were analysed using thematic analysis with responses coded and collated into themes. Blind coding was performed by two independent researchers before themes were agreed.
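By way of illustration, the paired and independent comparisons described above can be sketched in a few lines of Python with SciPy; the ratings, even group split and variable names below are hypothetical stand-ins for the purpose of the sketch, not the study data.

    import numpy as np
    from scipy import stats

    # Hypothetical 0-9 helpfulness ratings from the same 14 respondents
    # for each modality (paired design); values are illustrative only.
    eclipse_ratings = np.array([8, 7, 9, 8, 6, 7, 8, 9, 7, 8, 6, 7, 8, 7])
    vert_ratings = np.array([5, 4, 7, 5, 3, 4, 6, 6, 4, 5, 2, 4, 5, 3])

    # Check normality of the paired differences before the paired t-test.
    shapiro_stat, shapiro_p = stats.shapiro(eclipse_ratings - vert_ratings)

    # Paired t-test: the same students rated both modalities.
    t_paired, p_paired = stats.ttest_rel(eclipse_ratings, vert_ratings)

    # Independent t-test: VERT ratings split by which modality was used first
    # (an assumed even split, purely for illustration).
    vert_first = vert_ratings[:7]
    eclipse_first = vert_ratings[7:]
    t_ind, p_ind = stats.ttest_ind(vert_first, eclipse_first)

    print(f"paired: t = {t_paired:.2f}, p = {p_paired:.4f}")
    print(f"independent: t = {t_ind:.2f}, p = {p_ind:.4f}")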

University Research Ethics Committee approval was provided for this project. All students received information about the evaluation project and were advised that participation in data collection was voluntary and that all data were anonymous. It was also explained that participation status would not be known to the teaching team and would not affect student performance, support or opportunities. Informed consent was sought in relation to use of the survey data for evaluation purposes.

Results

Of the 24 students, 14 completed the online survey. Most students (13 out of 14) enjoyed the plan evaluation session and expressed a desire to use VERT as an additional plan evaluation tool in the future. Most students (11 out of 14) found the session to be useful. Students were asked to rate the extent to which the two modalities helped them to evaluate their plans against a range of plan evaluation parameters. Table 1 shows the mean usefulness scores on the 0–9 scale, while Table 2 shows the preferred format responses, coded as ‘1’s and ‘0’s to indicate each student’s choice.

Table 3 summarises the inferential analysis for the statistically significant data. Following testing for normality, a paired t-test for constraint evaluation across all 14 students and all 5 constraint questions (70 paired responses) demonstrated a mean increase of 3 points in favour of Eclipse in terms of helpfulness compared with VERT (p < 0·001). In addition, a paired t-test across the ease of set-up and delivery domains demonstrated a mean difference of 2·3 across 28 paired responses (14 students comparing each modality) in favour of VERT when evaluating ease of set-up and delivery (p < 0·001).

Table 3. Comparison of means t-test results

An independent t-test compared the groups that accessed VERT or Eclipse first, to identify any differences in their perception of the usefulness of VERT depending on the order of evaluation. This showed a statistically significant difference (p = 0·001) in student ratings of VERT usefulness for evaluation of constraints in favour of the group that used Eclipse first (mean score of 6) compared with those who used VERT first (mean score of 4·3). There was no statistically significant difference in perception of Eclipse usefulness between the groups.

Student comments were collated into themes relating to which tools within VERT they found the most useful (Table 4) and what additional tools or functionality would have helped within VERT (Table 5) and Eclipse (Table 6). The final question challenged students to identify the role that VERT could play in plan evaluation, as seen in Table 7.

Table 4. Most useful plan evaluation tools within VERT

Table 5. Desired additional plan evaluation tools for VERT

Table 6. Desired additional plan evaluation tools for Eclipse

Table 7. Future role of VERT in plan evaluation

Discussion

Role of VERT in plan evaluation

It was clear that, overall, the students perceived VERT to be a helpful tool for some key aspects of plan evaluation. The visualisation aspects in particular helped students to evaluate the clinical delivery and set-up factors that could impact on plan viability. Comments indicated that being able to see the actual machine deliver the plan helped students to understand clinical delivery issues. Most of the students recommended that VERT should play a role in plan evaluation. Table 7 includes comments identifying the value of VERT for evaluating set-up and delivery, where it was seen as a particularly important addition to TPS-based evaluation. These findings confirmed published predictions [4, 5] concerning the potential value of VERT for plan evaluation.

Interestingly, the students who used Eclipse first as an evaluation tool all stated that their preferred method for evaluating each dose constraint or objective was to use both systems together, whereas the majority of those who used VERT first suggested that Eclipse alone was their preferred format for assessment. In rating the value of each modality for assessing individual dose constraints, the value placed on Eclipse was similar for both groups, while a higher value was reported for VERT by the group that used it second. It was also clear that, despite their relative inexperience with VERT, all students had managed to access the necessary functions. This suggests that the software functionality is intuitive and the training requirements minimal.

It was interesting to note that VERT was perceived to be useful when evaluating dosimetric factors such as target and organ at risk (OAR) doses, which are primarily the remit of a TPS, although the mean rating for this functionality was significantly lower than for Eclipse (p = 0·001). There was a clear acknowledgement, however, that while VERT helped to provide a useful overview of the plan and potential issues, a TPS was essential for formal plan evaluation. When students were asked what additional features would have made VERT more useful for this task, DVH depiction was a common request. This would certainly have provided more insight into the formal achievement of dose–volume constraints and target objectives [17].
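Since DVH depiction was the most commonly requested addition, it is worth noting that a cumulative DVH is straightforward to derive from a dose grid and a binary structure mask. The short Python sketch below uses made-up arrays, an assumed bin width and an illustrative function name; it is not a description of how VERT or Eclipse implement DVHs.

    import numpy as np

    def cumulative_dvh(dose_grid, structure_mask, bin_width=0.1):
        # Cumulative DVH: percentage of the structure volume receiving
        # at least each dose level (Gy).
        doses = dose_grid[structure_mask]
        levels = np.arange(0.0, doses.max() + bin_width, bin_width)
        volume_pct = np.array([(doses >= d).mean() * 100.0 for d in levels])
        return levels, volume_pct

    # Illustrative dose grid and cuboid 'organ' mask (made-up data).
    rng = np.random.default_rng(0)
    dose = rng.normal(loc=20.0, scale=5.0, size=(50, 50, 50)).clip(min=0)
    mask = np.zeros(dose.shape, dtype=bool)
    mask[20:30, 20:30, 20:30] = True

    levels, volumes = cumulative_dvh(dose, mask)
    v20 = volumes[np.searchsorted(levels, 20.0)]  # e.g. V20 for the structure
    print(f"V20 = {v20:.1f}% of structure volume receives >= 20 Gy")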

Educational role of VERT

In addition to helping with the plan evaluation exercise, students also suggested that VERT would have provided useful insight during their treatment planning teaching sessions. The visualisation of dose to OAR structures, using intuitive mouse controls to change perspective, rotate and zoom, was felt to be particularly useful, as seen in Table 4. The ability to visualise the 3D dose and volume relationship in VERT provided visual feedback on choice of beam angles, dose homogeneity and volumes of over- or under-dosage. In this study, the students learned about the impact of these factors on plan viability and were able to include this in their assessed written evaluations. It was interesting to see students discover additional useful functionality in VERT, for example visualising all beams at once on the VERT plan. It would be instructive to repeat this exercise with a larger sample of students, using VERT as an interim plan evaluation tool, and to measure what changes, if any, are made to plans as a result of visualisation in VERT. Inclusion of quantitative analysis of performance would also provide useful insight into the specific impact of VERT on plan evaluation. The findings from this study will be implemented locally, with VERT-based plan evaluation being embedded in the curriculum for all relevant cohorts.

Limitations

The small scale of the study should be acknowledged as a limitation; planned collaborative work will aim to increase the sample size by including data from multiple institutions. It should also be acknowledged that these data were provided by student radiographers and that experienced planners would perhaps have a better instinctive understanding of the 3D positioning of fields and dose deposition. It would be interesting to repeat this study with experienced planners to gain their perspective on the specific value of VERT for clinical plan evaluation. Conversely, it would also be valuable to compare the experience of first-time users provided with identical training and guidance.

Conclusions

This small study has shown that student radiographers found value in using VERT for plan evaluation alongside standard TPS tools. The ability to visualise structures, dose and beam delivery in 3D provided students with increased understanding of the clinical set-up and delivery aspects of planning. Comments from the students also suggested that VERT should be used more frequently throughout their planning modules to enhance their understanding of dosimetric principles and relational CT anatomy. Further work is ongoing to identify the specific impact of VERT as a plan evaluation tool for both students and qualified planners.

References

1. Feuvret L, Noël G, Mazeron JJ, Bey P. Conformity index: a review. Int J Radiat Oncol Biol Phys 2006; 64: 333–342.
2. Bridge P, Warren M, Pagett M. Use of planning metrics software for automated feedback to radiotherapy students. J Radiother Pract 2016; 15 (4): 385–391.
3. Bridge P, Appleyard R, Ward J, Phillips R, Beavis A. The development and evaluation of a virtual radiotherapy treatment machine using an immersive visualisation environment. Comput Educ 2007; 49 (2): 481–494.
4. Kane P. Simulation-based education: a narrative review of the use of VERT in radiation therapy education. J Med Radiat Sci 2018; 65 (2): 131–136.
5. Chamunyonga C, Burbery J, Caldwell P, Rutledge P, Fielding A, Crowe S. Utilising the virtual environment for radiotherapy training system to support undergraduate teaching of IMRT, VMAT, DCAT treatment planning, and QA concepts. J Med Imag Radiat Sci 2018; 49 (1): 31–38.
6. Boejen A, Beavis A, Nielsen K et al. Training of radiation therapists using a 3D virtual environment. Radiother Oncol 2007; 84: S275.
7. Phillips R, Ward JW, Page L et al. Virtual reality training for radiotherapy becomes a reality. Stud Health Technol Inform 2008; 132: 366–371.
8. Beavis A, Ward J. The development of a virtual reality dosimetry training platform for physics training. Med Phys 2012; 39 (6): 3969.
9. Kirby MC. Teaching radiotherapy physics concepts using simulation: experience with student radiographers in Liverpool, UK. Med Phys Int 2015; 3 (2): 87–93.
10. Kirby MC. The VERT physics environment for teaching radiotherapy physics concepts – update of four years’ experience. Med Phys Int 2018; 6 (2): 247–254.
11. Jimenez Y, Ronn Hansen C, Juneja P, Thwaites DI. Successful implementation of virtual environment for radiotherapy training (VERT) in medical physics education: the University of Sydney’s initial experience and recommendations. Australas Phys Eng Sci Med 2017; 40: 909–916.
12. Jimenez Y, Thwaites DI, Juneja P, Lewis SJ. Interprofessional education: evaluation of a radiation therapy and medical physics student simulation workshop. J Med Radiat Sci 2018; 65: 106–113.
13. James S, Dumbleton C. An evaluation of the utilisation of the virtual environment for radiotherapy training (VERT) in clinical radiotherapy centres across the UK. Radiography 2013; 19 (2): 142–150.
14. Stewart-Lord A, Brown M, Noor S, Cook J, Jallow O. The utilisation of virtual images in patient information giving sessions for prostate cancer patients prior to radiotherapy. Radiography 2016; 22: 269–273.
15. Stewart-Lord A. From education to research: a journey of utilising virtual training. J Radiother Pract 2016; 15 (1): 58–90.
16. Bridge P, Giles E, Williams A, Boejen A, Appleyard R, Kirby M. International audit of virtual environment for radiotherapy training usage. J Radiother Pract 2017; 16: 375–382.
17. ICRU Report 83. Prescribing, recording, and reporting intensity-modulated photon-beam therapy (IMRT). Bethesda, MD: International Commission on Radiation Units and Measurements, 2010.