A virtual patient authoring tool for transcatheter aortic valve replacement

We propose an authoring tool pipeline consisting of a morphable animated 3D model, algorithms to dynamically limit the value ranges, an automatic patient generator, an interface with a physiology simulator, and a virtual catheterisation laboratory (vCathLab) environment to display the model in VR (Fig. 1).

Fig. 1

Overview of our virtual patient authoring tool

Experimental setup

We used a desktop PC with an NVIDIA GeForce RTX 3080 GPU, a 10th-generation Intel Core i9 CPU, and 32 GB of RAM. Blender (Blender Foundation, Netherlands) was used for 3D modelling and content creation, and Unreal Engine 5.0.3 (Epic Games, USA) was selected as the development platform. For the VR development, a Meta Quest 2 (Meta, USA) device was connected via Quest Link and interfaced through SteamVR (Valve Corporation, USA). The patient's physiology is simulated using the Pulse Physiology Engine (PPE) (Kitware, USA).

Identification of anatomical features

Understanding the role of the anatomical structures in disease and treatment is one of the leading training goals in TAVR education [18]. Hence, providing reliable information during the training is critical. To achieve results comparable with patient-data-dependent models, we identified the major anatomical features impacting a TAVR procedure, from treatment selection to the operation itself [18]. Using the medical literature, the mean and range of these variables were determined (see Table 1).

Table 1 List of anatomical features, their measurements, and relative PVA group information

Anatomical dataset

We used the Z-Anatomy anatomical model library (Z-Anatomy, Gauthier Kervyn) as the basis for our 3D heart model. This library provides a variety of 3D models of healthy organs and systems based on previously acquired medical images. We included the aorta (up to the abdominal bifurcation), coronary arteries, heart muscle, left and right atria, aortic valve, and left ventricle. These meshes were further modified in Blender to fulfil our needs, e.g. polygon reduction, improving the valves' polygon count, refining the ventriculoarterial (VA) junction, isolating the left ventricle from the heart muscle, creating the sinotubular (ST) junction bulging, and assigning separate materials to the aorta, each valve leaflet (inner and outer surfaces), left ventricle, heart muscle, and left and right coronary arteries (LCA and RCA). Further modifications are discussed below (see Sect. "Morphing of anatomical features").

Morphing of anatomical features

Per-vertex animation (PVA) is an established method for vertex-level manipulation of a 3D model, used for instance in facial animation. In this method, the locations of the vertices are controlled by interpolating between two predefined positions using an interpolation value [25]. We implemented PVA for our heart model based on the values in Table 1 (see Fig. 2). Using the in-app measuring tool, we measured each feature (e.g. diameter, height, circumference) while applying our vertex modifications to achieve realistic results. These values do not cover the full reported ranges due to limitations of the heart model; while this excludes extreme cases, it still covers a wide variety of anatomies. To improve PVA malleability, some anatomical features were broken down into multiple PVA groups, namely the ostia of the arteries, which consist of separate shifting groups for up, down, left, and right. Furthermore, adjusting the vertices in smaller groups, combined with manually tuning the PVA values for the extreme cases, ensures that minimal distortion and self-intersection appear in the model after changing PVA values.
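The interpolation underlying PVA can be sketched as a simple blend between two predefined vertex sets (a minimal illustration only; the array names and single-vertex example are ours, not part of the implementation):

```python
import numpy as np

def morph_vertices(base, target, alpha):
    """Linearly interpolate vertex positions between two predefined shapes.

    base, target: (N, 3) arrays of vertex coordinates for alpha = 0 and alpha = 1.
    alpha: the interpolation (slider) value, clamped to [0, 1].
    """
    alpha = float(np.clip(alpha, 0.0, 1.0))
    return (1.0 - alpha) * base + alpha * target

# Example: a single vertex moving between its two predefined positions
base = np.array([[0.0, 0.0, 0.0]])
target = np.array([[2.0, 0.0, 0.0]])
print(morph_vertices(base, target, 0.5))  # midway: [[1. 0. 0.]]
```

Grouping vertices (e.g. the separate up/down/left/right ostium groups) then amounts to applying such a blend to each subset of vertices with its own slider value.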

PVA offers computationally cheap and highly detailed animation sequences by saving different values on a timeline. We used this method to create animation sequences for the beating of the heart muscle, coronary arteries, left ventricle, and aorta. While these sequences are synced visually, their values are accessible and can be adjusted separately. This offers the opportunity for modifiable heart animations based on the simulated patient of “PPE”, such as rapid pacing.
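The coupling between the simulated patient and the heartbeat animation can be sketched as a playback-rate scaling (an illustrative sketch under our own naming; the baseline rate and function are assumptions, not PPE's API):

```python
def playback_rate(heart_rate_bpm, baseline_bpm=60.0):
    """Scale factor for the heartbeat animation loop.

    The animation sequence is assumed to be authored so that one loop equals
    one beat at baseline_bpm; the heart rate reported by the physiology
    simulation rescales playback speed, e.g. during rapid pacing.
    """
    return heart_rate_bpm / baseline_bpm

print(playback_rate(180.0))  # rapid pacing at 180 bpm plays the loop 3x faster
```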

Fig. 2

A PVA example for the aorta (orange), prepared in Blender. Aorta PVA interpolation value, normalised between 0 and 1: left, 0; right, 0.7

Calculation of surgical risk

Surgical risk scores are one criterion for selecting the proper treatment for each patient. A widely used scoring system is Euroscore II, which estimates the mortality rate of cardiac surgery in adult patients [26]. Based on this score, the heart team decides whether to perform open surgery, an intervention, or neither. We include Euroscore II to add more variety and realism to our cases, and to simulate physiological behaviours and complications based on some of these parameters.
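Euroscore II is a logistic regression model: the predicted mortality is \(e^{y}/(1+e^{y})\), where \(y\) is an intercept plus a weighted sum of the patient's risk factors. A minimal sketch of this form follows; the coefficient values and factor names below are illustrative placeholders, not the published Euroscore II coefficients:

```python
import math

# Illustrative (not published) coefficients for a Euroscore-II-style
# logistic model: predicted mortality = exp(y) / (1 + exp(y)),
# with y = intercept + sum(coefficient_i * x_i).
INTERCEPT = -5.32
COEFFS = {
    "age_factor": 0.03,       # hypothetical: per year above a threshold
    "female": 0.22,           # hypothetical binary factor
    "renal_impairment": 0.64, # hypothetical binary factor
    "critical_state": 1.09,   # hypothetical binary factor
}

def predicted_mortality(patient):
    """Return predicted mortality in [0, 1] for a dict of risk-factor values."""
    y = INTERCEPT + sum(COEFFS[k] * patient.get(k, 0.0) for k in COEFFS)
    return math.exp(y) / (1.0 + math.exp(y))

risk = predicted_mortality({"age_factor": 15, "female": 1, "critical_state": 1})
print(f"{risk:.1%}")
```

Because the score is a single scalar derived from standardised inputs, updating it after every parameter change (as our interface does) is computationally trivial.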

Environments

According to our clinical partners, the user should be able to navigate the visualisation of the heart model, toggle the visibility of heart regions, generate a new random patient, change PVA values and see the changes made to the model in real-time, read and change the Euroscore parameters, and save and load the generated model. Additionally, the user should be able to see the generated model inside the vCathLab using a fluoroscopy device, showcasing the generated patient. To implement these functionalities, we divided them into two environments. The first environment (authoring tool) includes everything related to generating the desired patient, and the second contains the vCathLab, which can be viewed on desktop and VR platforms.

Fig. 3

Authoring tool environment with top) Euroscore II information tab and bottom) Morphology sliders tab visible

Authoring tool

This environment provides the means for generating the VPs, the main goal of this work. It consists of the heart model plus the Euroscore and morphology parameters, which are accessed via four tabs: Euroscore II information, planar control, morphology control, and morphology sliders (see Fig. 3). Our design choices, such as the UI and visualisation designs, were inspired by conventions in medical literature and software (see Fig. 4). The model viewer can be navigated by rotating, panning, and zooming. To view the underlying structures of the heart, a visibility toggle feature was introduced that controls each segment of the heart separately. Moreover, a planar control module aids the user by snapping the view to the sagittal, axial, and coronal planes and providing a clipping plane for each.
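The clipping planes reduce to a signed-distance test per vertex (a minimal geometric sketch; the function and argument names are ours):

```python
import numpy as np

def clip_vertices(vertices, plane_normal, plane_point):
    """Return a boolean mask of vertices kept by a clipping plane.

    A vertex is kept when its signed distance to the plane is <= 0,
    i.e. it lies on the side the normal points away from. The axial,
    sagittal, and coronal planes are just axis-aligned special cases.
    """
    n = np.asarray(plane_normal, dtype=float)
    n /= np.linalg.norm(n)
    signed = (np.asarray(vertices, dtype=float) - plane_point) @ n
    return signed <= 0.0

# Axial clipping plane through the origin: keep everything below it
verts = np.array([[0.0, 0.0, -1.0], [0.0, 0.0, 1.0]])
mask = clip_vertices(verts, plane_normal=[0, 0, 1], plane_point=[0, 0, 0])
print(mask)  # [ True False]
```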

Fig. 4

Medical design conventions as shown in left) Syngo Aortic Valve Guide (Siemens Healthcare GmbH, Germany) and right) the developed environment

The Euroscore information can be viewed and modified through numerical inputs, dropdowns, and checkboxes, yielding new patient pre-condition criteria and changing the physiological behaviour. The Euroscore parameters are the counterpart to our morphology manipulation and are therefore a direct way of interacting with the patient physiology simulation. This approach grants access to physiological modifications through a limited set of standardised, well-known parameters, instead of an excessive number of sliders and checkboxes. After each value is changed, the Euroscore is updated.

One key feature of our system is the automatic generation of a new patient, in both the anatomical and the Euroscore parameters. These parameters are separated into two groups: dependent and independent (see Table 1). Accordingly, the pseudo-random algorithm generates the independent variables first and, based on them, the dependent ones. For the Euroscore values, a probability multiplier was introduced that controls the outcome of each argument. These multipliers were provided by an expert and are approximated based on frequent cases. For some anatomical features, the parameters are controlled via multivariate regression functions that prevent unrealistic values when a specific condition would otherwise arise, e.g. an ST junction narrower than the valve diameter. Each anatomical variable is picked from the ranges given in Table 1.
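The independent-then-dependent sampling with an anatomical constraint can be sketched as follows (the feature names, ranges, and the single constraint are illustrative stand-ins for Table 1, not the actual values):

```python
import random

# Hypothetical ranges loosely following Table 1's independent/dependent split
RANGES = {
    "annulus_diameter_mm": (20.0, 29.0),  # independent
    "st_junction_mm": (22.0, 36.0),       # dependent on the annulus diameter
}

def generate_patient(seed=None):
    """Draw independent variables first, then dependent ones under constraints."""
    rng = random.Random(seed)
    patient = {}
    # 1) independent variables are drawn freely from their ranges
    lo, hi = RANGES["annulus_diameter_mm"]
    patient["annulus_diameter_mm"] = rng.uniform(lo, hi)
    # 2) dependent variables are clamped so that anatomically implausible
    #    combinations (e.g. an ST junction narrower than the annulus) cannot occur
    lo, hi = RANGES["st_junction_mm"]
    lo = max(lo, patient["annulus_diameter_mm"])
    patient["st_junction_mm"] = rng.uniform(lo, hi)
    return patient

print(generate_patient(seed=42))
```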

Following the automatic model generation, the morphology control button enables the sliders tab, where the anatomical features can be modified. The tab provides the name, unit, and (in almost all cases) the value of the anatomical parameter being affected. These sliders control groups of PVA values that relate to one anatomical feature. In this way, we avoid redundancy and clutter in the workspace. We ensured that two-thirds of the screen is always dedicated to the display of the geometric model. Furthermore, the control tabs can be toggled to reduce workspace occupancy.

After creating the preferred patient, the user can move to the vCathLab to inspect the model with a fluoroscopy device. In parallel, a patient data set is generated from the Euroscore values and communicated to PPE. After PPE is initiated, the new environment loads and is ready for interaction. A save and load feature lets the user revisit or reuse the model multiple times.

Fig. 5

Virtual CathLab top) overview with contrasted coronary arteries, and bottom) monitoring screen and contrasted left ventricle and aorta

Virtual CathLab

While the focus of this work remains on the generation of a virtual patient using the authoring tool, we developed a VR environment to test the performance of the proposed method in a complex and computationally expensive scenario. This environment aims to provide the user with an immersive experience of a vCathLab. A virtual model of an ARTIS icono floor fluoroscopy system (Siemens Healthcare GmbH, Germany) is placed in this room with two monitors and a patient lying on the operation bed (see Fig. 5). The physiological parameters are displayed on the right monitor, and the live fluoroscopic images generated from the patient are rendered on the left monitor. The positions of the C-arm and the table can be manipulated by grabbing the virtual joysticks on the bedside control panel using the headset controllers. The contrast agent injection can be triggered separately for the coronary arteries, and for the ventricle and aorta. The virtual fluoroscopy device aims to be a virtual twin of the system, helping the user understand the workflows and operation of such devices in different treatment scenarios. Hence, we recreated its interactions, movement, and imaging behaviours as realistically as feasible.

To generate the fluoroscopic images, a virtual camera with ray-tracing capabilities was attached to the tube section of the virtual C-arm. This camera captures only the translucent patient model in a dedicated lighting channel and applies a post-processing shader to achieve the fluoroscopic appearance. Finally, it renders the camera view to a render target texture, which is shown on the left monitor. Generating an exact replica of the imaging characteristics and pipeline of the fluoroscopy device would have been demanding and beyond the scope of this project. Therefore, we implemented the relevant graphical aspects in the post-processing shader, which also limits the resource consumption of the virtual imaging system. These aspects are the grayscale visualisation, the grain noise texture and amount, the density of the tissue represented by alpha-channel values, ray-tracing and scattering behaviour in volumes, the intensity of the X-ray source, the contrast agent movement and concentration, and the additive effects of digital subtraction.
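The core of such a post-process, i.e. attenuation by tissue density plus grain noise on a grayscale image, can be sketched offline as follows (a toy CPU analogue of the shader; the attenuation constant and function name are our assumptions):

```python
import numpy as np

def fluoroscopy_post_process(density, source_intensity=1.0,
                             noise_amount=0.02, rng=None):
    """Toy analogue of the fluoroscopic post-processing described above.

    density: (H, W) array in [0, 1] of accumulated tissue density per ray
             (in the real system, derived from alpha-channel values).
    Returns a grayscale image in [0, 1]: denser tissue attenuates more
    X-rays and appears darker; film grain is added as Gaussian noise.
    """
    rng = rng or np.random.default_rng(0)
    attenuated = source_intensity * np.exp(-3.0 * density)  # Beer-Lambert-style falloff
    grain = rng.normal(0.0, noise_amount, density.shape)
    return np.clip(attenuated + grain, 0.0, 1.0)

# A density ramp from empty space (0) to dense tissue (1)
image = fluoroscopy_post_process(np.linspace(0, 1, 16).reshape(4, 4))
```

Contrast agent injection then amounts to temporarily raising the density values along the filled vessels, which darkens them in the rendered image.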

Evaluation

We evaluated our prototype through subjective usability scoring, qualitative feedback from experts, and a performance benchmark in VR. Seven board-certified physicians (two cardiologists and five cardiac surgeons) took part in our evaluation. These physicians participated in the study under the project's joint research agreement and were the primary representatives of our target demographic, i.e. trainers and trainees with a professional interest and involvement in TAVR procedures. Five participants have been actively involved in teaching for an average of six years (from one to fifteen years); five perform an average of 43.6 TAVR procedures per annum (ranging from five to 100); and two of the cardiac surgeons were in training for TAVR procedures. The evaluation was carried out in two session formats, i.e. expert feedback and performance benchmark sessions (see Fig. 6).

We used the System Usability Scale (SUS) questionnaire [27] for our usability test. This well-established test is based on ten simple questions with Likert-scale responses, enabling users to subjectively rate their experience with the system in terms of three aspects, i.e. effectiveness, efficiency, and satisfaction, on a scale of zero to 100 (worst to best). Bangor et al. devised an adjective rating scale for SUS to make the scale more readable and simpler to interpret [28]. This adjective rating scale consists of seven states: Worst Imaginable, Awful, Poor, OK, Good, Excellent, and Best Imaginable.
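The standard SUS scoring rule maps the ten Likert responses onto the 0-100 scale as follows (the function is a straightforward sketch of the published scoring procedure):

```python
def sus_score(responses):
    """Compute the System Usability Scale score from ten Likert responses (1-5).

    Odd-numbered items are positively worded and contribute (response - 1);
    even-numbered items are negatively worded and contribute (5 - response).
    The sum of contributions is multiplied by 2.5 to reach the 0-100 range.
    """
    assert len(responses) == 10 and all(1 <= r <= 5 for r in responses)
    contributions = [
        (r - 1) if i % 2 == 0 else (5 - r)  # 0-based index: even index = odd item
        for i, r in enumerate(responses)
    ]
    return 2.5 * sum(contributions)

print(sus_score([5, 1, 5, 1, 5, 1, 5, 1, 5, 1]))  # best possible: 100.0
print(sus_score([4, 2, 4, 2, 4, 2, 4, 2, 4, 2]))  # 75.0
```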

Frames per second (FPS), i.e. framerate, represents the smoothness of a virtual experience and thus the level of immersion in such environments. As the experience becomes more complex and more computational power is required, e.g. in VR, the time needed to render one image of the virtual scene increases, leading to a lower FPS. The hardware capabilities, e.g. CPU and GPU, directly impact the FPS; however, the bottlenecks and resource consumption stay relatively the same across different hardware. Since FPS dictates input lag and perceived smoothness, it is fundamental to the overall immersive experience and to cybersickness [29]. Furthermore, it encapsulates the relevant computational processes, requirements, and bottlenecks in computer graphics. Hence, FPS was selected as the performance measure. FPS values can be recorded and analysed using benchmarking and profiling tools, which in our case are provided by Unreal Engine.
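FPS is simply the reciprocal of the per-frame render time; over a capture it should be averaged over total time so that long, stuttering frames are weighted correctly (a short sketch of this reduction; the function name is ours):

```python
def average_fps(frame_times_ms):
    """Average framerate over a capture of per-frame render times in ms.

    Dividing frame count by total elapsed time (rather than averaging
    per-frame FPS values) weights slow frames correctly.
    """
    total_s = sum(frame_times_ms) / 1000.0
    return len(frame_times_ms) / total_s

# Four frames at ~11.1 ms each correspond to a 90 Hz target refresh rate
print(round(average_fps([11.1, 11.1, 11.1, 11.1]), 1))  # 90.1
```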

To evaluate the performance of the generated patient in the VR environment, we ran a benchmark test using the Unreal Engine profiling tool. In each session, we generated a new random patient. We then transferred the model to our vCathLab, where we inspected it through the developed fluoroscopy and contrasting methods for approximately 3 min. During these tests, PPE ran in parallel. The performance test was run by the VR developer to ensure that all functionalities and bottlenecks were assessed properly within the time frame.

Fig. 6

Top) evaluation sessions overview (Appendix A and 2.6), bottom left) an expert during the expert evaluation session, bottom centre) a user during a VR session, and bottom right) the Meta Quest 2 device used for VR benchmarking
