Note

Smartphone application for mechanical quality assurance of medical linear accelerators


Published 10 May 2017 © 2017 Institute of Physics and Engineering in Medicine
Citation: Hwiyoung Kim et al 2017 Phys. Med. Biol. 62 N257. DOI: 10.1088/1361-6560/aa67d5


Abstract

Mechanical quality assurance (QA) of medical linear accelerators consists of time-consuming and human-error-prone procedures. We developed a smartphone application system for mechanical QA. The system consists of two smartphones: one attached to a gantry for obtaining real-time information on the mechanical parameters of the medical linear accelerator, and another displaying real-time information via a Bluetooth connection with the former. Motion sensors embedded in the smartphone were used to measure gantry and collimator rotations. Images taken by the smartphone's high-resolution camera were processed to evaluate the accuracy of jaw positioning, crosshair centering and source-to-surface distance (SSD). The application was developed using the Android software development kit and the OpenCV library. The accuracy and precision of the system were validated against an optical rotation stage and digital calipers prior to routine QA measurements of five medical linear accelerators. The system accuracy and precision in measuring angles and lengths were determined to be 0.05 ± 0.05° and 0.25 ± 0.14 mm, respectively. The mean absolute errors (MAEs) in QA measurements of gantry and collimator rotation were 0.05 ± 0.04° and 0.05 ± 0.04°, respectively. The MAE in QA measurements of the light field was 0.39 ± 0.36 mm. The MAEs in QA measurements of crosshair centering and SSD were 0.40 ± 0.35 mm and 0.41 ± 0.32 mm, respectively. In conclusion, most routine mechanical QA procedures could be performed using the smartphone application system with improved precision and within a shorter time-frame, while eliminating potential human errors.


1. Introduction

Mechanical misalignment of medical linear accelerators (LINACs) may result in significant dosimetric deviation from planned values (Xing et al 2000). During treatment, the mechanical accuracy of LINACs should be maintained within given tolerances. Therapists rely solely on the LINAC digital indicators to confirm this consistency, assuming that all indicators were fully validated prior to treatment. For this reason, the Institute of Physics and Engineering in Medicine (IPEM) report no. 81 and the American Association of Physicists in Medicine (AAPM) Task Group 142 (TG-142) report recommend periodic mechanical quality assurance (QA) procedures with tolerance values for each item (Mayles et al 1999, Klein et al 2009).

These mechanical QA procedures are human-error-prone and inefficient (Welsh et al 2002) because they are performed manually, e.g. by checking the gantry/collimator rotation angle with the naked eye using a level attached to the gantry. Therefore, various automated mechanical QA programs and methods have been reported to reduce human errors and the burden on medical physicists (Arjomandy and Altschuler 2000, Low et al 2002, Li et al 2003, Poppe et al 2003, Casar et al 2009, Du and Yang 2009). Most attempts to automate QA procedures utilize x-ray based systems, such as electronic portal imaging devices (EPIDs) or radiochromic films (Welsh et al 2000, Winkler et al 2003, Adamson and Wu 2012, Rowshanfarzad et al 2012a, 2012b, 2014, Nyflot et al 2014). These kinds of customized devices worked well in practice, but were only valid for some specific QA items.

A smartphone is a representative example of highly integrated technology: it offers a variety of ways to acquire information through a high-resolution camera and a range of sensitive sensors. These are unique features that distinguish smartphones from other computing devices. Consequently, smartphones have been widely used for healthcare applications (Burdette et al 2008, Hussain et al 2015), such as clinical decision support systems (Martinez-Perez et al 2014, Tian et al 2014, de la Torre-Diez et al 2015), patient monitoring systems (Chandrasekaran et al 2013, McManus et al 2013) and surgery assistance (Warnock 2012). In this work, we developed a smartphone-application system for mechanical QA of medical LINACs. To the best of our knowledge, this is the first smartphone application covering the comprehensive set of mechanical QA items. The system overview is described in section 2.1. In sections 2.2 and 2.3, we describe the two implemented modules, i.e. motion sensor signal processing and image processing. The system was validated against an optical rotation stage and digital calipers to determine its accuracy and precision (sections 2.4 and 3.1) prior to the mechanical QA measurements (sections 2.5 and 3.2).

2. Materials and methods

2.1. System overview

As recommended in the IPEM no. 81 and AAPM TG-142 reports, mechanical QA consists of various procedures to maintain geometric consistency regarding gantry and collimator rotation angles, jaw positioning, crosshair centering and source-to-surface distance (SSD) using an optical distance indicator (ODI; Mayles et al 1999, Klein et al 2009). Using the motion sensors and high-resolution camera embedded in a smartphone, a smartphone-application system was developed to eliminate possible human errors and time-consuming manual methods. The system hardware consists of two smartphones: one (the device) attached to the gantry to obtain real-time information on the mechanical parameters of the medical linear accelerator, and the other (the displayer) to display this information in real time via a wireless network (specifically, Bluetooth) connection with the device. The system software consists of two main modules: (1) a motion sensor signal processing module; (2) an optical image processing module. The application was developed with the Android software development kit (SDK) and the OpenCV library. In this study, a Galaxy Note 3 (Samsung Electronics Co. Ltd, Korea) smartphone was used. Figure 1 shows a schematic of the system.
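For illustration, the following is a minimal sketch of how the device-to-displayer link could be implemented with the standard Android Bluetooth API. The serial-port UUID, the use of a paired MAC address and the plain-text message format are assumptions made for this sketch, not details taken from the implementation described in this work.

    // Minimal sketch of the device-to-displayer link over classic Bluetooth
    // (RFCOMM serial-port profile). Requires the BLUETOOTH permission.
    // The message format ("GANTRY,90.05") is illustrative only.
    import android.bluetooth.BluetoothAdapter;
    import android.bluetooth.BluetoothDevice;
    import android.bluetooth.BluetoothSocket;
    import java.io.OutputStream;
    import java.nio.charset.StandardCharsets;
    import java.util.UUID;

    public class MeasurementLink {
        // Well-known serial-port profile UUID
        private static final UUID SPP_UUID =
                UUID.fromString("00001101-0000-1000-8000-00805F9B34FB");
        private BluetoothSocket socket;

        /** Connect the gantry-mounted device to the hand-held displayer. */
        public void connect(String displayerMacAddress) throws Exception {
            BluetoothAdapter adapter = BluetoothAdapter.getDefaultAdapter();
            BluetoothDevice displayer = adapter.getRemoteDevice(displayerMacAddress);
            socket = displayer.createRfcommSocketToServiceRecord(SPP_UUID);
            socket.connect();
        }

        /** Push one real-time reading to the displayer. */
        public void send(String reading) throws Exception {
            OutputStream out = socket.getOutputStream();
            out.write((reading + "\n").getBytes(StandardCharsets.UTF_8));
            out.flush();
        }
    }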


Figure 1. Schematic of the smartphone-application system. The system consists of two main modules. One is the motion sensor signal processing module, used to measure the gantry and collimator rotation angles with the various motion sensors embedded in the smartphone. The other is the optical image processing module, used to measure the jaw position, crosshair centering and SSD with the high-resolution camera embedded in the smartphone.


2.1.1. Motion sensor signal processing module.

Three motion sensors (gyroscope, accelerometer and magnetic field sensor) embedded in the smartphone (device) were used to measure gantry and collimator angles. As shown in figure 2, the roll rotation axis of the device is perpendicular to the central axis (CAX) of the medical linear accelerator. Hence, the rotation angle of the smartphone corresponds to that of the gantry or the collimator. Combined use of the three different sensors (known as the sensor fusion method) enhanced the accuracy of determining the orientation of the smartphone with respect to gravity (described in section 2.2) (Ayub et al 2012).


Figure 2. A smartphone (device) attached to the LINAC gantry. The roll rotation axis of the device is perpendicular to the CAX of the medical linear accelerator. In order to receive and display the measured data from the device, a wireless network module was implemented so that another smartphone (displayer) communicates with the device via Bluetooth.


2.1.2. Optical image processing module.

An optical-image processing module using an image taken by the high-resolution camera was developed to assess jaw-positioning, crosshair-centering and SSD. A series of image processing steps was employed to determine the location of feature points such as the crosshair-center and edges of X and Y jaws (described in section 2.3).

2.2. Motion sensor signal processing module

To derive a rotation angle of the smartphone, various motion sensors embedded in the smartphone were used together. Each sensor has its own drawbacks (Maenaka 2008), which were compensated by using information from the other sensors. In general, a sensor fusion method combines more than one sensor to obtain a better output (Gebre-Egziabher et al 2004, Ayub et al 2012). The gyroscope provides a low-noise rotation angle measurement; however, it does not correspond to the absolute orientation of the smartphone. Despite its ability to derive the absolute orientation of the smartphone, the accelerometer may produce excessive noise, and smoothing this noise with a low-pass filter introduces a response delay. Therefore, the sensor fusion method primarily uses the integrated gyroscope signal (relative orientation) and prevents the gyroscope signal from drifting through constant correction by the accelerometer signal, to obtain the absolute orientation of the smartphone with respect to gravity (figure 3).
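The following is a minimal sketch of such a fusion for the roll angle, written against the standard Android sensor API. The complementary-filter weight, the axis assignment and the listener wiring are illustrative assumptions rather than the authors' implementation.

    // Minimal complementary-filter sketch for the roll angle: integrate the
    // gyroscope (low noise, relative) and correct it slowly towards the
    // accelerometer-derived roll (absolute with respect to gravity).
    import android.hardware.Sensor;
    import android.hardware.SensorEvent;
    import android.hardware.SensorEventListener;
    import android.hardware.SensorManager;

    public class RollFusion implements SensorEventListener {
        private static final float ALPHA = 0.98f;   // gyroscope weight (assumed value)
        private static final float NS2S = 1.0e-9f;  // nanoseconds to seconds
        private float rollDeg = 0f;                 // fused roll angle (degrees)
        private long lastGyroTimestamp = 0L;

        public void start(SensorManager sm) {
            sm.registerListener(this, sm.getDefaultSensor(Sensor.TYPE_GYROSCOPE),
                    SensorManager.SENSOR_DELAY_GAME);
            sm.registerListener(this, sm.getDefaultSensor(Sensor.TYPE_ACCELEROMETER),
                    SensorManager.SENSOR_DELAY_GAME);
        }

        @Override public void onSensorChanged(SensorEvent event) {
            if (event.sensor.getType() == Sensor.TYPE_GYROSCOPE) {
                if (lastGyroTimestamp != 0L) {
                    float dt = (event.timestamp - lastGyroTimestamp) * NS2S;
                    // integrate the angular velocity (rad/s) about the assumed roll axis
                    rollDeg += (float) (Math.toDegrees(event.values[2]) * dt);
                }
                lastGyroTimestamp = event.timestamp;
            } else if (event.sensor.getType() == Sensor.TYPE_ACCELEROMETER) {
                // absolute roll from the direction of gravity (axis choice is an assumption)
                double accRollDeg = Math.toDegrees(Math.atan2(event.values[0], event.values[1]));
                rollDeg = ALPHA * rollDeg + (1 - ALPHA) * (float) accRollDeg;
            }
        }

        @Override public void onAccuracyChanged(Sensor sensor, int accuracy) { }

        public float getRollDeg() { return rollDeg; }
    }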


Figure 3. Process of the sensor fusion method. The relative orientation of the smartphone is determined primarily by the gyroscope, which measures angular velocity. Thereafter, the relative orientation is corrected by the accelerometer or the compass to derive the absolute orientation of the smartphone. Noise in the accelerometer and compass signals was removed by low-pass filters.


2.3. Optical image processing module

2.3.1. Preprocessing of acquired image.

In order to acquire exact locations of the jaw openings from the light field image projected on the treatment couch, image pre-processing was needed. First, a gray-scale image was acquired by normalizing the 16 bit RGB image to eliminate illumination effects; thereafter, a binary image was created by thresholding at half of the maximum intensity. Finally, edges were extracted from the binary image with the Canny method (Bao et al 2005) to recognize the crosshair and the full-width at half-maximum (FWHM) boundary of the light field.
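The following is a minimal sketch of this preprocessing chain using the OpenCV Java bindings; the Canny hysteresis thresholds are illustrative assumptions.

    // Minimal sketch: grayscale -> half-maximum threshold -> Canny edge map.
    import org.opencv.core.Core;
    import org.opencv.core.Mat;
    import org.opencv.imgproc.Imgproc;

    public class LightFieldPreprocess {
        /** Returns an edge map of the projected light field and crosshair. */
        public static Mat edges(Mat rgbFrame) {
            Mat gray = new Mat();
            Imgproc.cvtColor(rgbFrame, gray, Imgproc.COLOR_RGB2GRAY);

            // binary threshold at half of the maximum intensity
            double maxVal = Core.minMaxLoc(gray).maxVal;
            Mat binary = new Mat();
            Imgproc.threshold(gray, binary, maxVal / 2.0, 255, Imgproc.THRESH_BINARY);

            // Canny edge detection on the binary image (hysteresis values assumed)
            Mat edgeMap = new Mat();
            Imgproc.Canny(binary, edgeMap, 50, 150);
            return edgeMap;
        }
    }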

2.3.2. Length calibration.

To measure the absolute length from an image, images of an object of known geometry (herein a 100 KRW coin) were taken. A circle detection module was implemented using the Hough transform method to determine the radius of the circle in pixels (Roushdy 2007). With the determined radius, a calibration factor (CF) was obtained as below:

${\rm CF}~({\rm mm~px}^{-1})=\frac{{\rm real~radius}~({\rm mm})}{{\rm measured~radius}~({\rm px})}$

For instance, if the real radius of the known geometry was 12 mm and the number of pixels corresponding to the radius was 50 px, then the calibration factor was determined to be 0.24 mm px–1.
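The following is a minimal sketch of this calibration step using OpenCV's Hough circle transform; the detector parameters are illustrative assumptions.

    // Minimal sketch: detect the known circular object and convert its pixel
    // radius into a calibration factor (mm per pixel).
    import org.opencv.core.Mat;
    import org.opencv.imgproc.Imgproc;

    public class LengthCalibration {
        /** Returns the calibration factor in mm per pixel, or NaN if no circle is found. */
        public static double calibrationFactor(Mat grayImage, double realRadiusMm) {
            Mat circles = new Mat();
            Imgproc.HoughCircles(grayImage, circles, Imgproc.HOUGH_GRADIENT,
                    1.0,                      // accumulator resolution (dp)
                    grayImage.rows() / 4.0,   // minimum distance between circle centres
                    100, 30,                  // Canny / accumulator thresholds (assumed)
                    10, 0);                   // radius search range in pixels (assumed)
            if (circles.cols() == 0) return Double.NaN;
            double radiusPx = circles.get(0, 0)[2];   // (x, y, r) of the first circle
            return realRadiusMm / radiusPx;           // e.g. 12 mm / 50 px = 0.24 mm per px
        }
    }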

2.3.3. Feature point detection.

The Harris corner detection method (Harris and Stephens 1988) was employed to detect the feature points. For each pixel, it builds a 2 × 2 second-moment matrix from products of the first partial derivatives of the image intensity within a local window. At a corner this matrix has two large eigenvalues, and thus the strongest features in the edge image were obtained. In figure 4, the five strongest features corresponded to the crosshair center and the positions of the four jaws. To refine the corner locations to sub-pixel accuracy, the intensity gradient of the grayscale image was calculated near each corner. Using the fact that the dot product of two perpendicular vectors is zero, a system of equations was set up from the candidate points near each corner detected by the Harris method and solved for the refined corner location.
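The following is a minimal sketch of this feature detection and sub-pixel refinement, here using OpenCV's Harris-based goodFeaturesToTrack followed by cornerSubPix, which iterates on the same perpendicular-gradient (zero dot product) criterion; the detector parameters are illustrative assumptions rather than the authors' settings.

    // Minimal sketch: the five strongest Harris features (crosshair centre and
    // four jaw edges), refined to sub-pixel accuracy.
    import org.opencv.core.Mat;
    import org.opencv.core.MatOfPoint;
    import org.opencv.core.MatOfPoint2f;
    import org.opencv.core.Size;
    import org.opencv.core.TermCriteria;
    import org.opencv.imgproc.Imgproc;

    public class FeatureDetection {
        public static MatOfPoint2f detect(Mat gray) {
            MatOfPoint coarse = new MatOfPoint();
            Imgproc.goodFeaturesToTrack(gray, coarse,
                    5,          // keep the five strongest features
                    0.01,       // quality level (assumed)
                    20,         // minimum distance between features in px (assumed)
                    new Mat(),  // no mask
                    3,          // block size
                    true,       // use the Harris response
                    0.04);      // Harris k

            MatOfPoint2f refined = new MatOfPoint2f(coarse.toArray());
            TermCriteria criteria = new TermCriteria(
                    TermCriteria.EPS + TermCriteria.MAX_ITER, 40, 0.001);
            // gradient-based sub-pixel refinement around each coarse corner
            Imgproc.cornerSubPix(gray, refined, new Size(5, 5), new Size(-1, -1), criteria);
            return refined;
        }
    }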


Figure 4. The process of the image preprocessing module. Jaw openings and crosshairs were recognized by the Canny method using a grayscale image taken by the high-resolution camera. Then feature points were detected by the Harris method to calculate the locations of each of the jaws and the crosshair.


2.4. System evaluation experiment

The objective of this evaluation was to determine the accuracy and precision of the system prior to its application to a LINAC. In order to exclude errors due to gantry sag or other sources, the smartphone itself was evaluated without being attached to the gantry. We derived the mean absolute error (MAE) and the standard deviation (SD) of the test sets to determine the accuracy and precision of the system.

${\rm MAE}=\bar{e}=\frac{\sum_{i}{|\Delta e_{i}|}}{n}$, where $\Delta e_{i}$ is the error of the ith test and $n$ is the number of tests.

${\rm SD}=\sqrt{\frac{\sum_{i}{{{\left( \Delta e_{i}-\bar{e} \right)}^{2}}}}{n}}$, where $\bar{e}$ is the MAE.
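The following is a minimal sketch of these two statistics; names are illustrative.

    // Minimal sketch of the MAE and SD defined above for a set of measurement errors.
    public final class ErrorStats {
        /** Returns {MAE, SD} for the errors deltaE of one test set. */
        public static double[] maeAndSd(double[] deltaE) {
            int n = deltaE.length;
            double mae = 0.0;
            for (double e : deltaE) mae += Math.abs(e);
            mae /= n;
            double sumSq = 0.0;
            for (double e : deltaE) sumSq += (e - mae) * (e - mae);   // deviation about the MAE
            return new double[] { mae, Math.sqrt(sumSq / n) };
        }
    }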

By comparing the derived system accuracy and precision with the tolerances recommended by the relevant societies (e.g. IPEM no. 81 and AAPM TG-142 reports), the feasibility of clinical application of the developed system was validated.

2.4.1. System accuracy and precision—angle measurement with motion sensors.

The accuracy and precision of the system for the rotation angle were evaluated by measuring various orientations of the smartphone attached to an optical rotation stage (ST1-418-B4, Sciencetown, South Korea). The roll rotation angle of the smartphone corresponded to the gantry rotation angle, and the yaw rotation angle of the smartphone facing right corresponded to the collimator rotation angle (figure 5). The tests were performed at various angles set with the optical rotation stage. For the roll rotation, the optical rotation stage was varied from −20° to 20° with a 1° resolution and a 0.03° precision. For the yaw rotation, we measured 0° to 270° rotations of the smartphone attached to the optical rotation stage with a 0.1° resolution and a 0.003° precision. The test sets included 10 known rotations, each of which was repeated three times. We derived the MAEs and SDs of the 10 test sets.


Figure 5. Angular accuracy and precision were evaluated by measuring the orientation of the smartphone attached to the optical rotation stage and comparing with the actual angle indicated by the optical stage. The roll rotation angle of the smartphone refers to the gantry rotation angle (left) and the yaw rotation angle of the smartphone facing right refers to the collimator rotation angle (right).


2.4.2. System accuracy and precision—length measurement by image processing.

The accuracy and precision for the optical length were evaluated by measuring the sizes of 10 test templates (5  ×  5 cm2, 7  ×  7 cm2, 9.5  ×  9.5 cm2, 9.8  ×  9.8 cm2, 10  ×  10 cm2, 10.2  ×  10.2 cm2, 10.5  ×  10.5 cm2, 12  ×  12 cm2, 15  ×  15 cm2, and 20  ×  20 cm2 squares) at a camera-to-template distance of 50 cm, following calibration with the known geometry of a 5 cm radius circle. Each template size was confirmed by using digital calipers with a 0.03 mm precision (Absolute 500, Mitutoyo, IL, USA). Each of the test sets was repeated three times. We derived MAEs and SDs of 10 test sets.

2.5. Clinical application: monthly mechanical QA

We performed monthly mechanical QA procedures using the developed system. The tests were performed with five medical LINACs: 6EX, 21EX and Trilogy (Varian Medical Systems, CA, USA) installed at Seoul National University Hospital (SNUH); 21IX (Varian Medical Systems, CA, USA) installed at Jeju National University Hospital (JNUH); and Infinity (Elekta, UK) installed at Soonchunhyang University Hospital (SCHUH). In these tests, the device was attached to the gantry with double-sided tape (3M 5068, 3M Corp., MN, USA).

2.5.1. Gantry rotation angle indicator.

The roll rotation axis of the smartphone was perpendicular to the CAX of the medical linear accelerator. Hence, the rotation angle of the smartphone referred to that of the gantry. The gantry rotation was measured at gantry angles of 0°, 90°, 180° and 270°. The angles measured by the developed system were compared to values on the LINAC's digital indicator. Simultaneously, the accuracy of the developed system was confirmed by using a bubble level (manual method). Each test set was repeated twice and experiments were conducted over three or five different days. We derived the MAEs and SDs of the test sets.

2.5.2. Collimator rotation angle indicator.

As an electronic device, a LINAC can interfere with the geomagnetic signal of the compass sensor. Therefore, we used the yaw rotation of the smartphone at a gantry angle of 90°, i.e. with the smartphone facing right. At this position, we could obtain the absolute yaw rotation of the smartphone with respect to gravity, which corresponded to the collimator angle. The collimator rotation was measured at collimator angles of 0°, 90° and 270° (a 180° collimator rotation was not allowed). The measured angles were compared to the LINAC's digital indicator. Simultaneously, the accuracy of the developed system was confirmed by using a bubble level (manual method). Each test set was repeated twice and experiments were conducted over three or five different days. We derived MAEs and SDs of the test sets.

2.5.3. Jaw position indicator.

From the determined locations of the feature points described in section 2.3.3, the light field size was determined by counting the number of pixels between the crosshair and each jaw position. The measured length of the light field was then calculated using the calibration factor as follows:

${\rm length}~({\rm mm})={\rm number~of~pixels}~({\rm px})\times {\rm CF}~({\rm mm~px}^{-1})$

Each jaw position was measured at an SSD of 100 cm and a field size of 10  ×  10 cm2. Thereafter, we intentionally changed the field sizes asymmetrically and confirmed them by using digital calipers (0.01 mm precision) before measuring by using the developed system. Each test set was repeated twice and experiments were conducted over three or five different days. We derived MAEs and SDs of the test sets.

2.5.4. Crosshair centering.

The developed system tracked the real-time position of a crosshair with a video mode, and thus the maximum deviation from the initial position of the crosshair (i.e. crosshair location at the collimator angle of 0°) was measured (figure 6). Each test set was repeated twice and experiments were conducted over three or five different days. We derived MAEs and SDs of the test sets.


Figure 6. The system can track the real-time position of a crosshair with a video mode, and thus the maximum deviation from the initial position of the crosshair (i.e. crosshair location at collimator angle of 0°) can be measured.


2.5.5. Optical distance indicator.

It was assumed that the smartphone was parallel to the image plane projected on the treatment couch and lens distortion was negligible. The pinhole camera model was defined as a one-to-one relationship between real geometry and that projected onto the image plane (figure 7).


Figure 7. One-to-one relationship between the real geometry and that projected onto the image plane, used to derive the camera-to-surface distance (CSD).


In order to calculate a camera-to-surface distance (CSD), a simple formula was derived from the pinhole camera model, with all the device-related specifications taken from the manufacturer of the smartphone:

${\rm CSD}~({\rm mm})=\frac{F\times {\rm RL}\times {\rm FL}}{{\rm SL}\times {\rm IL}}$

F: focal length (mm) = 4.13 by specification
FL: frame length (px) = 1920 by camera setting
SL: sensor length (mm) = 4.69 by specification
RL: real length (mm) = 100 where SSD = 100 cm, FS = 10 × 10 cm2
IL: measured length by image processing (px)

Finally, the SSD was derived from the calculated CSD by simply adding the source-to-camera distance (SCD). In this study, the SCDs were initially assumed to be 60 cm and 53 cm for the Varian LINACs and the Elekta LINAC, respectively; these initial values were taken from the Monte Carlo simulation package provided by the manufacturer. However, to confirm the accuracy of the two given parameters (SCD and focal length), nine combinations of the two parameters were tested. We precisely fixed an SSD of 100 cm and a field size of 10 × 10 cm2 by using a well-calibrated front-pointer and digital calipers. Discrepancies between the measured field size and the actual field size (10 × 10 cm2) were measured for the nine combinations of SCD and focal length. The combination resulting in the minimum discrepancy was chosen as producing the true values of SCD and focal length. With these values, the SSD measured with the developed system was compared to the SSD measured with well-calibrated front-pointers at 90 cm, 100 cm and 110 cm. Ten test sets were performed for the three values of SSD.
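The following is a minimal sketch of the CSD and SSD calculation, assuming the simple pinhole relation above; the class and method names and the example pixel length are illustrative assumptions.

    // Minimal sketch of the CSD/SSD estimate. The constants are the Galaxy
    // Note 3 values quoted above; the example pixel length is illustrative.
    public final class OdiEstimate {
        static final double F_MM  = 4.13;   // focal length (mm), manufacturer specification
        static final double FL_PX = 1920;   // frame length (px), camera setting
        static final double SL_MM = 4.69;   // sensor length (mm), manufacturer specification

        /** Pinhole-model camera-to-surface distance (mm) for a feature of known
         *  real length rlMm spanning ilPx pixels in the image. */
        static double csdMm(double rlMm, double ilPx) {
            return (F_MM * rlMm * FL_PX) / (SL_MM * ilPx);
        }

        /** SSD (mm) = CSD + source-to-camera distance (SCD). */
        static double ssdMm(double rlMm, double ilPx, double scdMm) {
            return csdMm(rlMm, ilPx) + scdMm;
        }

        public static void main(String[] args) {
            // Illustrative numbers only: a 100 mm light field spanning about 423 px,
            // with the Varian SCD of 600.1 mm reported in section 3.2.4.
            System.out.printf("SSD = %.1f mm%n", ssdMm(100.0, 423.0, 600.1));
        }
    }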

3. Results

3.1. System accuracy and precision

According to the IPEM no. 81 and AAPM TG-142 reports, the tolerance values for mechanical consistency were 1.0° for the gantry/collimator angles and 1.0 mm for the length measurements relevant to jaw positioning, crosshair centering and the optical distance indicator. The accuracy (MAE) and precision (SD) of the developed system in measuring angles and lengths were determined to be 0.05 ± 0.05° and 0.25 ± 0.14 mm, respectively. These values are sufficient for testing the mechanical QA items. Detailed results of the system accuracy and precision are given in sections 3.1.1 and 3.1.2.

3.1.1. System accuracy and precision—angle measurement with motion sensors.

The results of the system accuracy and precision for the gantry (roll) and collimator (yaw) rotations are shown in table 1. The MAEs ± SDs of the rotation angle measurements were determined to be 0.05 ± 0.05° and 0.05 ± 0.05° for the roll and yaw rotations, respectively. The maximum absolute errors were 0.10° and 0.12°, respectively. Even though the motion sensor resolution specified by the manufacturer (MPU6500, Invensense Inc., CA, USA) was 0.0036°, the system precision was intentionally limited to the level of 0.1° because of its high sensitivity to noise at higher precision.

Table 1. System accuracy and precision: MAEs and SDs of measured rotation angles and lengths for ten test sets. Other sources of error such as gantry sag were excluded.

N = 10        Gantry angle (°)   Collimator angle (°)   Field size x (mm)   Field size y (mm)   Field size total (mm)
MAE ± SD      0.05 ± 0.05        0.05 ± 0.05            0.24 ± 0.15         0.26 ± 0.14         0.25 ± 0.14
Max (abs.)    0.10               0.12                   0.50                0.50                0.50

Abbreviations: MAE  =  mean absolute error; SD  =  standard deviation; Max  =  maximum error. Note: maximum errors were calculated in absolute terms.

3.1.2. System accuracy and precision—length measurement by image processing.

The accuracy and precision of the system for the field size measurements are shown in table 1. With a camera resolution of 4128 × 3096, the calibration factor was 0.107 mm px–1 when the system was calibrated using the known geometry of a 5 cm radius circle. The MAEs ± SDs of the field size measurements in the x and y directions were determined to be 0.24 ± 0.15 mm and 0.26 ± 0.14 mm, respectively. The total MAE ± SD over the various reference field sizes was 0.25 ± 0.14 mm. The maximum absolute error was 0.50 mm.

3.2. Clinical application: monthly mechanical QA

Table 2 is an uncertainty list of the system and manual methods for LINAC monthly mechanical QA. The uncertainties of each test were taken as the SD. The uncertainties of the system and manual methods in all tests were lower than the tolerances specified in the IPEM no. 81 and AAPM TG-142 reports.

Table 2. Uncertainty list of the system and manual methods for LINAC monthly mechanical QA. The tolerances for each test refer to the IPEM no. 81 and AAPM TG-142 reports.

Procedure                                 QA tolerance (IPEM no. 81/AAPM TG-142)   System uncertainty   Manual uncertainty
Gantry rotation angle indicator (°)       1.00/1.00                                0.04                 0.08
Collimator rotation angle indicator (°)   1.00/1.00                                0.04                 0.11
Jaw position indicator (mm)               2.00/1.00                                0.36                 0.33
Crosshair centering (mm)                  2.00/1.00                                0.35                 0.27
Optical distance indicator (mm)           2.00/1.00                                0.32                 0.31

Abbreviations: QA  =  quality assurance; IPEM  =  Institute of Physics and Engineering in Medicine; AAPM  =  American Association of Physicists in Medicine; TG  =  task group.

Figure 8 shows, as a box plot, the results of the monthly mechanical QA items measured manually and by using the developed system. These values are the differences between the manual or system measurements and the LINAC's digital indicators. For all of the listed items, the accuracy and precision of the developed system were almost equivalent to those of the manual method, while the angular precision (SD) of the system was better than that of the manual method. There was no significant difference between the results of the five LINACs. Detailed results of the assessment of each item are given in sections 3.2.1–3.2.4.


Figure 8. Box plot for comparison between the system (green boxes) and manual (red boxes) results. These are differences between the manual or the system measurements and the LINAC's digital indicators. Each test set was repeated twice and experiments were conducted over three or five different days.


3.2.1. Gantry and collimator rotation angle indicator assessment.

All the angles measured with the application system were consistent with those indicated by the bubble level. The system always showed better precision than the manual method. The system can also display the absolute angles of the gantry and collimator rotation with respect to gravity within 0.1° precision.

3.2.2. Jaw position indicator assessment.

The MAE ± SD of the field sizes measured with the system was 0.39 ± 0.36 mm. As described in section 2.3.1, the binary threshold was half of the maximum intensity, i.e. the FWHM edge of the light field; thus the human error involved in identifying the FWHM by eye was eliminated.

3.2.3. Crosshair centering assessment.

The system tracked the crosshair well in real time, and the MAE of the maximum measured deviations was below 1 mm (⩽5 px). With the system, the effort of drawing several crosslines to measure the crosshair location offset was removed.

3.2.4. Optical distance indicator assessment.

Prior to the SSD measurements, the two parameters, focal length and SCD, were chosen to minimize the difference between the measured and actual field sizes over the nine combinations of the two parameters listed in table 3. The optimal focal length and SCD were determined to be 4.12 mm and 600.1 mm, respectively. With these parameters, we performed SSD measurements for the ODI QA of the Varian LINACs. We did the same for the Infinity LINAC, for which the optimal focal length and SCD were determined to be 4.12 mm and 530.2 mm, respectively.

Table 3. A set of test pairs of SCD and focal length used to minimize the discrepancy between the measured field size and the actual field size. The selected pair with the minimum difference was SCD = 600.1 mm and focal length = 4.12 mm.

SCD (mm) Focal length (mm) Diff. (mm)
599.9 4.12 0.42
599.9 4.13 −0.55
599.9 4.14 −1.52
600.0 4.12 0.32
600.0 4.13 −0.65
600.0 4.14 −1.62
600.1 4.12 0.22
600.1 4.13 −0.75
600.1 4.14 −1.72

Abbreviations: SCD = source-to-camera distance; Diff = mean difference between measured and actual field size.

The MAE ± SD of the measured SSD was 0.41 ± 0.32 mm over all test sets for all LINACs. The results obtained with the application system were also consistent with those obtained with the manual front-pointers.

4. Discussion and conclusions

Several smartphone applications for radiotherapy have been introduced recently. Ono et al utilized a smartphone motion sensor for patient respiratory monitoring (Ono et al 2011), and Schiefer et al measured the iso-center path characteristics of the gantry rotation axis using a smartphone camera, reducing the burden of the Winston–Lutz test (Schiefer et al 2015). However, these applications required additional auxiliary devices and were not a comprehensive approach to mechanical QA for radiotherapy. Welsh et al introduced a single platform that assembled a variety of QA tools, such as digital inclinometers (Welsh et al 2002). The proposed device was shown to be stable and accurate in clinical tests. However, the system with an auxiliary device was still prone to human errors for some QA procedures, such as field size assessment and ODI verification.

The developed system is the first smartphone application to comprehensively address the routine mechanical QA process. It does not require specialized equipment or auxiliary devices, allowing anyone to adopt this method by simply installing the application on a smartphone. Combined use of three motion sensors (sensor fusion) enhanced the precision of determining the smartphone orientation with respect to gravity, and allowed measurement of any gantry/collimator rotation angle. Image processing guaranteed the accuracy and precision of measuring the lengths of the features (e.g. crosshair location or jaw positions) in the acquired image. The accuracy and precision of the system were measured using an optical rotation stage and digital calipers and were determined to be acceptable for performing mechanical QA procedures to check the accuracy of the digital (or optical) indicators of medical linear accelerators. With this system, manual QA procedures can be automated and thus potential human errors can be eliminated.

Initial calibration is strongly recommended, because this system assumes that the rotational axes of the smartphone attached to the gantry are perfectly perpendicular or parallel to the beam path. If they are not perfectly matched, the projected rectangle (light field) will be distorted, and this can be corrected by rectification (Borkowski et al 2003; an illustrative sketch is given below). To achieve a more reliable and reproducible setup, well-fabricated smartphone holders for various LINAC models are also required.

During treatment, therapists fully rely on the mechanical parameter values displayed on the LINAC's digital indicator, yet no system is available for independent on-line monitoring of the digital (optical) indicators. If the smartphone application system developed in this study were embedded into a medical linear accelerator, independent on-line monitoring of mechanical accuracy could be provided during treatment, eventually eliminating time-consuming periodic QA procedures. Moreover, by adding augmented reality techniques, this system could be applied to QA phantom setups. Modules synchronized with the LINAC digital indicators are under development to provide more convenient implementation and operator use.
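As an illustration of the rectification mentioned above, the following is a minimal sketch using OpenCV's perspective transform; the corner ordering and output size are assumptions for this sketch.

    // Minimal sketch: warp the camera frame so that the four detected corners
    // of the (distorted) light field map onto an ideal rectangle.
    import org.opencv.core.Mat;
    import org.opencv.core.MatOfPoint2f;
    import org.opencv.core.Point;
    import org.opencv.core.Size;
    import org.opencv.imgproc.Imgproc;

    public class Rectification {
        /** fieldCorners must be ordered top-left, top-right, bottom-right, bottom-left. */
        public static Mat rectify(Mat frame, Point[] fieldCorners, int widthPx, int heightPx) {
            MatOfPoint2f src = new MatOfPoint2f(fieldCorners);
            MatOfPoint2f dst = new MatOfPoint2f(
                    new Point(0, 0), new Point(widthPx - 1, 0),
                    new Point(widthPx - 1, heightPx - 1), new Point(0, heightPx - 1));
            Mat homography = Imgproc.getPerspectiveTransform(src, dst);
            Mat rectified = new Mat();
            Imgproc.warpPerspective(frame, rectified, homography, new Size(widthPx, heightPx));
            return rectified;
        }
    }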

In conclusion, most routine mechanical QA items could be performed by using the smartphone-application system with improved precision and within a shorter time-frame, while eliminating potential human errors of the conventional manual method.

Acknowledgments

This work was in part supported by the National Research Foundation of Korea (NRF-2013M2B2B1075772 and NRF-2011-0018980) grants and the Nuclear Safety and Security Commission of Korea (490-2015036) grant.
