(Continued in Supplementary Text 1).
The overall design of a smartphone includes the touchscreen display, camera, operating system, network connectivity, and many sensors, including an accelerometer, gyroscope, proximity sensor, magnetometer, and others. Smartphones can display images of reasonable quality, potentially suitable for certain standard procedural tasks, such as rough guidance of a needle angle during an image-guided biopsy or ablation.
Camera
The built-in smartphone camera enables a variety of functions beyond imaging, such as face recognition, document scanning, QR code reading, spatial referencing, and measurement. These features allow built-in smartphone cameras to perform a wide range of clinical tasks, from serving as imaging devices to supporting augmented reality (AR)-guided procedures. For such reasons, smartphone cameras have been applied to procedures and surgeries in many specialties, for procedural planning, navigation, imaging, visualization, composite device placement, and verification (such as with tumor biopsy or thermal ablation). Specific clinical examples of diverse camera applications include percutaneous image-guided biopsy, endoscopy, laryngoscopy, and patient positioning for CT planning. Additionally, smartphone-assisted procedures have been reported in orthopedic surgeries, including total hip arthroplasty [4,5,6] and knee arthroplasty [7]. In each application, smartphone cameras were used to improve visualization and procedural planning.
Internet
The internet and available networks are critical enabling factors in communication via smartphones. Smartphone processing units may not have the capacity to process large quantities of information within the practical time frames required. In such cases, large datasets may need to be processed or stored remotely on an independent workstation, via internet, cloud, Bluetooth, or network transfer. Smartphone data handling also introduces complex regulatory, security, and privacy issues, which may vary depending upon geography, national laws, research or clinical use, method of use, and intent. Physicians and patients commonly communicate remotely via the internet in telehealth. Diagnostic radiology applications are not reviewed here, but include stroke evaluation via images from non-contrast CT and CTA uploaded to a cloud server and available on smartphones [8].
Bluetooth
Bluetooth allows short-range communication with proximate external hardware. Paired devices exchange data with one device acting as the transmitter and the other as the receiver. This enables interfacing with local workstations.
Audio
Audible feedback can also be used for navigated needle adjustment in percutaneous procedures (e.g., a tone change when the optimal angle or depth is reached, or when approaching critical structures) [9]. Audio may also serve as input for multi-modality artificial intelligence models. Audio acquired with a voice recorder can be converted, via application of a frequency-domain filter bank, into mel spectrograms, which model sound frequency over time on a perceptual scale of pitches that humans hear as equidistant. Such images may theoretically be analyzed for a procedural or medical task, such as determining the level of anesthetic required, analyzing breathing sounds to predict level of awareness or sedation, or classifying procedural pain. Mel spectrograms are used in voice recognition systems such as Siri or Alexa.
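The waveform-to-mel-spectrogram transformation described above can be sketched in a few lines. The following is a minimal, illustrative implementation using only NumPy; production pipelines typically use a dedicated audio library such as librosa, and all parameter values here (FFT size, hop length, number of mel bands) are arbitrary choices:

```python
import numpy as np

def hz_to_mel(f):
    # Mel scale: equal mel steps are perceived as equally spaced pitches
    return 2595.0 * np.log10(1.0 + f / 700.0)

def mel_to_hz(m):
    return 700.0 * (10.0 ** (m / 2595.0) - 1.0)

def mel_filterbank(n_mels, n_fft, sr):
    # Triangular filters spaced evenly on the mel scale
    mel_pts = np.linspace(hz_to_mel(0.0), hz_to_mel(sr / 2), n_mels + 2)
    bins = np.floor((n_fft + 1) * mel_to_hz(mel_pts) / sr).astype(int)
    fb = np.zeros((n_mels, n_fft // 2 + 1))
    for i in range(1, n_mels + 1):
        left, center, right = bins[i - 1], bins[i], bins[i + 1]
        for k in range(left, center):
            fb[i - 1, k] = (k - left) / max(center - left, 1)
        for k in range(center, right):
            fb[i - 1, k] = (right - k) / max(right - center, 1)
    return fb

def mel_spectrogram(signal, sr, n_fft=512, hop=256, n_mels=40):
    # Windowed short-time power spectrum, projected onto the mel filters
    frames = [signal[s:s + n_fft] * np.hanning(n_fft)
              for s in range(0, len(signal) - n_fft + 1, hop)]
    power = np.abs(np.fft.rfft(np.array(frames), axis=1)) ** 2
    return mel_filterbank(n_mels, n_fft, sr) @ power.T  # (n_mels, n_frames)

# Example: one second of a 440 Hz tone sampled at 8 kHz
sr = 8000
t = np.arange(sr) / sr
spec = mel_spectrogram(np.sin(2 * np.pi * 440.0 * t), sr)
print(spec.shape)  # (n_mels, n_frames)
```

The resulting two-dimensional array can be rendered as an image and fed to an image-classification model, which is how the audio tasks mentioned above would typically be framed.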
Sensors
Smartphones have many built-in sensors. Motion sensors include accelerometers, gyroscopes, gravity sensors, light detection and ranging (LiDAR) sensors, magnetometers, and rotational vector sensors. Environmental sensors include temperature, pressure, light, and humidity sensors. Position sensors include orientation and magnetic field sensors. An inertial measurement unit (IMU) typically integrates a gyroscope, accelerometer, and magnetometer to output angular velocity, acceleration, and magnetic fields in multiple axes [10]. Base sensors use data from a single physical sensor. In this review, the gyroscope and accelerometer were the most commonly used sensors in procedural applications in image-guided interventions, such as needle-based biopsy and ablation.
Accelerometer
The accelerometer measures the acceleration of the smartphone via a Micro-Electro-Mechanical System (MEMS), a system of microscopic mechanical sensors and actuators, that converts the motion of a seismic mass into a capacitance change in a circuit [11]. Its role is critical in Global Positioning System (GPS) tracking and measuring the speed of movement (Fig. 1).
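The capacitive readout principle can be sketched numerically. The toy model below illustrates how a proof-mass displacement maps to a differential capacitance change and back to acceleration; all device parameters are hypothetical, order-of-magnitude values, not those of any real smartphone component:

```python
EPS0 = 8.854e-12      # vacuum permittivity (F/m)
MASS = 1e-9           # seismic (proof) mass (kg) -- illustrative value
SPRING_K = 1.0        # suspension spring constant (N/m) -- illustrative
AREA = 1e-7           # electrode overlap area (m^2) -- illustrative
GAP = 2e-6            # nominal electrode gap (m) -- illustrative

def capacitance_delta(a):
    """Differential capacitance change for acceleration a (m/s^2)."""
    x = MASS * a / SPRING_K               # Hooke's law: mass displacement
    c_plus = EPS0 * AREA / (GAP - x)      # gap narrows on one side...
    c_minus = EPS0 * AREA / (GAP + x)     # ...and widens on the other
    return c_plus - c_minus               # differential readout rejects drift

def acceleration_from_delta(dc):
    """Invert the small-displacement approximation dC ~ 2*eps*A*x/d^2."""
    x = dc * GAP ** 2 / (2 * EPS0 * AREA)
    return SPRING_K * x / MASS

dc = capacitance_delta(9.81)              # capacitance change under 1 g
print(acceleration_from_delta(dc))        # recovers ~9.81 m/s^2
```

In the small-displacement regime the round trip recovers the applied acceleration almost exactly, which is why the differential capacitor pair serves as a practical linear transducer.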
Fig. 1 Components of a smartphone and procedural applications. High definition (HD), extreme dynamic range organic light-emitting diode (XDR OLED), Global Positioning System (GPS), light detection and ranging (LiDAR), microelectronic mechanical systems (MEMS)
Gyroscope
Like the accelerometer, the gyroscope is also a MEMS device. It measures the angular velocity of the smartphone in three axes to determine the smartphone’s orientation. It is often used for navigation systems or for smartphone games or photographs that require tilting [12]. Many clinical applications use built-in goniometers, a combination of an accelerometer and a gyroscope that can track motion and calculate the orientation of a phone [5, 13, 14]. In addition to analyzing motion accurately, a smartphone’s compactness grants operators more freedom of movement for better visualization of targeted anatomy, compared to traditional procedural devices. In fact, a smartphone with a built-in camera and gyroscope was used as an inexpensive, widely available, accurate, and reliable guidance device for CT-guided percutaneous needle-based procedures [15]. A mobile application was developed to facilitate needle angle selection; the application brought the planned needle angle to a real-time, low-cost, intra-procedural display, viewable both orthogonal to the axis of the needle and looking down the axis of the needle (along the shaft and hub). This direct visualization of the planned angle provided an alternative to heavy reliance on the physician's cognitive visuospatial memory and hand-to-eye estimates and skills (Figs. 2a, b, 3a, b, 4a, b, 5). Combined CT and smartphone guidance was significantly more accurate than CT-only “step-and-shoot” guidance for initial needle angle selection and placement. This led to a smaller final targeting error, suggesting that smartphone guidance can improve the accuracy of needle navigation and placement compared with the conventional method. A recent similar study also highlighted the benefits of a gyroscope in needle puncture, with improved accuracy and target hit rate [16].
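The accelerometer-plus-gyroscope "goniometer" described above is commonly implemented in software as a complementary filter: the accelerometer gives a drift-free but noisy tilt reference from gravity, while integrating the gyroscope gives a smooth but drifting angle. The sketch below is a simplified single-axis illustration with simulated sensor readings; the filter weight and sampling interval are illustrative choices, not values from the cited studies:

```python
import math

def complementary_filter(gyro_rates, accel_samples, dt, alpha=0.98):
    """Fuse gyroscope rates (rad/s) with accelerometer readings (m/s^2)
    into a single tilt-angle estimate (rad) about one axis.

    alpha close to 1 trusts the smooth gyroscope integration short-term,
    while the (1 - alpha) accelerometer term corrects long-term drift."""
    angle = 0.0
    for rate, (ax, az) in zip(gyro_rates, accel_samples):
        accel_angle = math.atan2(ax, az)   # tilt implied by gravity vector
        gyro_angle = angle + rate * dt     # integrate angular velocity
        angle = alpha * gyro_angle + (1 - alpha) * accel_angle
    return angle

# Example: phone held still at 30 degrees of tilt. The gyroscope reads
# ~0 rad/s, and the accelerometer sees gravity split across two axes.
tilt, g, n = math.radians(30), 9.81, 500
gyro = [0.0] * n
accel = [(g * math.sin(tilt), g * math.cos(tilt))] * n
est = complementary_filter(gyro, accel, dt=0.01)
print(math.degrees(est))  # converges to ~30 degrees
```

A needle-guidance application would compare such an estimate against the planned angle from the CT console and render the difference on screen in real time.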
Fig. 2 Smartphone application user interface [15]. a CT image. Lines show planned path on CT console. Number “122” refers to angle of needle in degrees. b Screen shot shows interface of mobile application. Green line is guideline that shows planned angle on smartphone's screen. Red circle is center of screen. Buttons are shown in blue. Record button is used to start and Still button is used to stop video recording and image capturing. Camera button is used to switch between front and back cameras. Calibrate button is for registering smartphone to CT scanner. Config (Configure) button is for setting up wireless connection between smartphone and local PC to allow optional image display on PC and additional control from PC. Needle button is used to set planned needle angle on smartphone
Fig. 3 Photographs show navigation independent from smartphone orientation [15]. Photographs obtained with user holding smartphone in landscape (a) and portrait (b) orientations show that path angle (green line) displayed on smartphone is always referenced to CT
Fig. 4 Photographs show smartphone in Guideline mode [15]. Smartphone can be placed either in front of needle (a) or behind needle (b) for guidance
Fig. 5 Smartphone with custom application for needle angle selection demonstrated in a real patient biopsy. Smartphone with custom application for imaging the needle and reproducing the angle of the superimposed (green) needle insertion pathway for a CT-guided biopsy. The pathway was pre-selected with a procedural CT scan, and the phone was calibrated to the CT table prior to placement in a sterile translucent bag. The actual needle resides beyond the smartphone, and the shaft and hub are seen superimposed on the needle path plan (green line). The application used the smartphone camera and gyroscope, and interfaced with the procedural CT scan DICOM images via Bluetooth
Applications in Interventional Radiology
Smartphone applications in IR are a relatively young area of research. Many studies focus on navigation tools for accurate needle positioning and insertion, a critical step for success in needle-based IR procedures. For this reason, many groups are exploring virtual and/or augmented reality to bypass larger, bulkier, and more expensive medical devices and technologies while achieving equivalent accuracy and efficiency. Although few original applications have been published yet, we aim to review recent explorations of smartphone applications in IR in hopes of stimulating further development of smartphone VR and AR applications intended for procedural settings.