Incorporation of “Artificial Intelligence” for Objective Pain Assessment: A Comprehensive Review

Introducing Artificial Intelligence in Healthcare

Artificial intelligence is defined as “the simulation of human intelligence processes by machines, especially computer systems. These processes include learning (the acquisition of information and rules for using the information), reasoning (using rules to reach approximate or definite conclusions), and self-correction” [12]. AI is a growing branch of computer engineering that applies novel concepts and solutions to complex challenges [13]. The opportunities for AI applications in healthcare, from the emergency department, outpatient clinics, and primary care to home care, are vast [14]. The spectrum of AI includes, but is not limited to, machine learning, deep learning, data mining, and natural language processing [15]. The medical field may benefit from two main categories of AI applications: physical and virtual. The physical subfield of AI in medicine involves medical equipment and care bots (intelligent robots) that assist in delivering medical care, while the virtual subfield includes machine learning (ML), natural language processing (NLP), and deep learning (DL) [16,17,18]. In general, machine learning is a subset of AI, while deep learning is a subset of machine learning (Fig. 2) [19, 120].

Fig. 2 Categories of artificial intelligence [12]

Artificial intelligence (AI) is a broad field that encompasses machine learning, deep learning, computer vision, natural language processing, and robotics [12]. Its main goal is to develop smart tools that can carry out cognitive functions such as problem-solving, sentiment analysis, and decision-making. Machine learning (ML) is a subset of AI whose algorithms enable systems to learn and improve from experience without being explicitly programmed; it delivers predictions and decisions by identifying patterns in its training data. Deep learning (DL) is a subset of machine learning that uses artificial neural networks to model and understand complex patterns and relationships in data. With this approach, an algorithm can determine whether a prediction is accurate through a neural network without human intervention, and deep learning models can build extensive knowledge over time, acting as a brain of sorts [12, 19, 20]. Computer vision (CV) refers to AI systems that can analyze and understand visual data, such as images and videos. Natural language processing (NLP) refers to AI systems that can understand and generate human language, enabling tasks such as speech recognition and language translation. Robotics refers to AI systems that can control and interact with physical robots to perform tasks in the physical world [12].

The last few decades have seen a rapidly growing role for AI in healthcare. The implementation of AI in healthcare can automate patient assessment and eliminate human bias. AI can improve clinical diagnoses, deliver rapid results, offer continuous monitoring of patients, facilitate decision-making, and help select the treatment plan [21].

Augmented intelligence, as opposed to artificial intelligence, is applied to augment human cognition rather than replace it. It is designed to enhance human capabilities by making effective use of information from large data sets to support human cognition [22]. Augmented intelligence can be used in different aspects of medicine, guiding the physician in early disease recognition, accurate diagnosis, and decision-making for complex diseases [23]. Both AI and augmented intelligence are likely to play an important role in the future. AI is capable of performing tasks that would otherwise require human intelligence and of replacing humans in those tasks, while augmented intelligence reflects the enhanced capability of human clinical decision-making achieved by combining human intelligence with machine-derived outputs to improve health [24].

AI Techniques in Pain Assessment

Various AI interventions have been utilized in recognizing, assessing, understanding, and treating pain, each with its own effectiveness and applicability [15]. It is important to note that the choice of a model depends on the specific task and dataset, and that different models may perform better for different tasks [10, 25]. Several studies have reported promising results for AI-based pain detection through facial expressions [25]. ML and AI can use data, whether self-report measures, physiological markers, or medical records, to create algorithms that better understand pain, assess changes in pain, and potentially predict treatment outcomes [27, 28]. These technologies can help improve the reliability of pain assessment tools. For example, ML algorithms can assess pain by establishing a baseline, then tracking the patient’s changing facial expressions and analyzing their vocal sounds [5, 10]. The following are examples of different AI techniques used for pain assessment and how they can improve patient outcomes (see Fig. 2).

Machine Learning (ML) Algorithms

ML is defined as the discovery and testing of algorithms that assist pattern recognition, classification, and prediction based on models built from existing data [29]. ML technology can teach computers to recognize patterns and images by supplying them with data and an algorithm [30]. ML algorithms, such as support vector machines, random forests, and deep learning models, have been applied to pain assessment tasks. ML techniques are highly beneficial for accurate pain classification, prediction, estimation of pain intensity, and identification of personalized treatment in conditions such as chronic low back pain or neuropathic pain [31]. They have shown promising results across different pain conditions and can handle complex, nonlinear relationships within the data. They can analyze various data sources, such as physiological signals, self-reported scores, and imaging data, and can estimate pain intensity from EEG signals. ML applications can be used in smart units, digital tools, smart watches, smart documentation, and electronic medical records [32].
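To illustrate the kind of workflow these studies describe, the following minimal sketch (Python, scikit-learn) trains a random forest classifier to predict pain-intensity categories from a table of physiological features. The feature names, labels, and synthetic data are illustrative assumptions for demonstration only and are not drawn from any study cited here.

```python
# Minimal sketch: classifying pain intensity from physiological features with
# a random forest. Feature names, labels, and data are illustrative only.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import classification_report
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Hypothetical feature matrix: one row per observation window, with columns such
# as mean heart rate, heart rate variability, skin conductance, and EMG amplitude.
X = rng.normal(size=(200, 4))
# Hypothetical labels: 0 = no pain, 1 = mild pain, 2 = moderate/severe pain.
y = rng.integers(0, 3, size=200)

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25, random_state=0)

clf = RandomForestClassifier(n_estimators=200, random_state=0)
clf.fit(X_train, y_train)

# On real data, per-class precision/recall would indicate how reliably each
# pain level is recognized; here the random data yields chance-level scores.
print(classification_report(y_test, clf.predict(X_test)))
```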

Several studies have developed novel ML models for pain recognition by analyzing facial expressions; these models automatically detected pain with relatively high accuracy in more than 95% of subjects [33,34,35,36]. Other studies used AI-based approaches to analyze clinical notes and patient records containing pain assessment information in order to identify components related to pain classification and severity [37]. A recent review provides evidence that machine learning, data mining, and natural language processing can improve the efficiency of pain recognition and pain assessment, analyze self-reported pain data, predict pain, and help clinicians and patients manage chronic pain more effectively [33,34,35,36]. These promising results may support the creation of new pain assessment instruments based on human language technology.

Another application of AI/ML concerns patients with severe dementia and those who cannot verbalize or communicate. For these patients, a novel method of pain assessment could be developed that combines several technologies, including automated facial recognition and analysis, smart computing, affective computing, and cloud computing. Such a system could support pain assessment in these challenging cases [38, 39].

Deep Learning and Neural Networks

Deep learning is a branch of ML, AI, and data science and analytics, distinguished by its ability to learn from the given data. Deep learning models have been used to recognize and classify pain-related facial expressions [39]. Convolutional neural networks (CNNs), recurrent neural networks (RNNs), and multilayer perceptrons (MLPs) are the three primary types of deep networks and are used extensively in many applications, including the analysis of complex pain-related data [19]. Medical images such as X-rays or MRI scans can be processed to extract meaningful features and patterns that shape the diagnosis and treatment plan, and electroencephalogram (EEG) readings can be analyzed to assess pain levels objectively [41]. DL can also analyze its own performance and, in that way, overcome unexpected challenges and improve overall outcomes [40].
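As a concrete illustration of this approach, the following minimal sketch (Python, TensorFlow/Keras) defines a small convolutional neural network for binary pain versus no-pain classification of face images. The architecture, input size, and training call are illustrative assumptions, not the models used in the cited studies.

```python
# Minimal sketch: a small CNN for binary pain / no-pain classification of
# grayscale face crops. Architecture and input size are illustrative only.
import tensorflow as tf

model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(64, 64, 1)),            # 64x64 grayscale face crop
    tf.keras.layers.Conv2D(16, 3, activation="relu"),     # learn local facial features
    tf.keras.layers.MaxPooling2D(),
    tf.keras.layers.Conv2D(32, 3, activation="relu"),
    tf.keras.layers.MaxPooling2D(),
    tf.keras.layers.Flatten(),
    tf.keras.layers.Dense(64, activation="relu"),
    tf.keras.layers.Dense(1, activation="sigmoid"),       # probability of "pain"
])

model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
model.summary()

# Training would use labeled face crops, e.g. (hypothetical arrays):
# model.fit(train_images, train_labels, epochs=10, validation_split=0.2)
```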

Natural Language Processing (NLP)

NLP denotes the field of study that emphasizes the interactions between human language and computers [42, 43]. Combining AI and linguistics, NLP depends on the interaction between computer science and human language [10]. NLP represents an evolution in language analysis for the automated assessment of patient-reported outcomes, enhancing the efficiency of pain assessment, and has proven effective in analyzing text-based pain assessments such as pain descriptions from patient reports, pain diaries, or electronic health records [10]. NLP techniques include language feature extraction, classification, and prediction. NLP has several practical applications in medicine: it can enable computerized clinical decision-support systems, improve healthcare management, and support tele-triage services, among other aims [44]. NLP can also analyze data from social media to identify pain-related experiences. Its goal is to understand, extract, and retrieve data from unstructured written and spoken texts, such as patient descriptions of their pain experiences; it can analyze the language used to describe pain, assist in pain assessment and patient triage, and identify pain-related trends or patterns. NLP has been able to extract information on pain intensity, pain location, and pain duration from social media data [45, 46].
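The following minimal sketch (Python standard library) illustrates one simple, rule-based form of the extraction task described above: pulling pain intensity, location, and duration out of free-text notes with regular expressions. The patterns and vocabulary are illustrative assumptions; the NLP systems cited above rely on far richer linguistic models.

```python
# Minimal sketch: rule-based extraction of pain intensity, location, and
# duration from free text. Patterns and vocabulary are illustrative only.
import re

LOCATIONS = ["back", "knee", "head", "shoulder", "abdomen", "chest"]

def extract_pain_info(text: str) -> dict:
    text_l = text.lower()
    # Intensity such as "7/10" or "7 out of 10".
    intensity = re.search(r"(\d{1,2})\s*(?:/|out of)\s*10", text_l)
    # Duration such as "for 3 days" or "for 2 weeks".
    duration = re.search(r"for\s+(\d+\s*(?:hours?|days?|weeks?|months?))", text_l)
    # First matching body site from a small illustrative vocabulary.
    location = next((loc for loc in LOCATIONS if loc in text_l), None)
    return {
        "intensity": intensity.group(1) if intensity else None,
        "location": location,
        "duration": duration.group(1) if duration else None,
    }

print(extract_pain_info("Patient reports lower back pain 7/10 for 3 days."))
# -> {'intensity': '7', 'location': 'back', 'duration': '3 days'}
```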

Computer Vision

Computer vision is the field of AI that deals with how computers can gain a high-level understanding of the patient from digital images or videos. Its main applications are feature extraction and image segmentation. The goal of feature extraction is to retrieve a restricted number of relevant features from an image in order to facilitate subsequent tasks, such as classification or regression. Image segmentation, on the other hand, labels each pixel of an image with a corresponding class in order to detect the relevant elements of the image [47].

Computer vision techniques have demonstrated effectiveness in pain assessment by analyzing visual signs or signals such as facial expressions, body postures, or movement patterns. Facial expression analysis, for example, can use deep learning models and computer vision to detect and classify pain-related facial signs and behaviors, such as grimacing or frowning [48, 49]. CV/ML pain assessment models derived from automated facial expression measurements performed well in detecting clinically significant pain and estimating pain severity at rest and during movement in the postoperative setting in youth; the automated facial expression measurements correlated positively with patients’ self-reported pain scores [50].
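A minimal sketch of such a pipeline is shown below (Python, OpenCV): detect the face in a video frame, crop it, and score the crop with a pain-expression classifier. Here `pain_model` is a hypothetical, pre-trained model (for example, the CNN sketched earlier); it is not a real library API, and the detection parameters are illustrative.

```python
# Minimal sketch of a computer-vision pain assessment step: detect the largest
# face in a frame and score it with a (hypothetical) pain-expression model.
import cv2

face_detector = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
)

def score_frame(frame, pain_model):
    """Return a pain-expression score in [0, 1] for the largest detected face."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    faces = face_detector.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    if len(faces) == 0:
        return None  # no face visible in this frame
    # Keep the largest detection and resize it to the model's input size.
    x, y, w, h = max(faces, key=lambda f: f[2] * f[3])
    crop = cv2.resize(gray[y:y + h, x:x + w], (64, 64)) / 255.0
    # `pain_model` is assumed to accept a (1, 64, 64, 1) array and return a
    # probability of a pain expression, as in the CNN sketch above.
    return float(pain_model.predict(crop.reshape(1, 64, 64, 1))[0, 0])
```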

A study conducted at the University of California San Diego examined computer vision-based deep learning models to predict pain measurements from facial images in 77 patients. The models focused on facial expressions, particularly around the eyebrows, lips, and nose. The study found that the Visual Analogue Scale (VAS) was predicted less accurately than the Critical Care Pain Observation Tool (CPOT). This is attributed to two important factors: first, the VAS is a subjective measurement that can be more influenced by emotional and other factors than the CPOT; second, there is a gap in continuous observation methods for pain during the perioperative period. The AI model could help improve patient care through real-time, continuous, and unbiased pain detection [50, 51].

Moreover, computer vision techniques can be applied to automatically identify pain-related facial expressions in specific populations, such as patients with dementia, non-verbal patients, or those with impaired communication. Computer vision algorithms have also been used for pain assessment through movement analysis based on patients’ movement patterns [5, 48].

Robotics

AI systems can control and interact with physical robots to perform tasks in the physical world. A robot is a programmable machine designed to perform specific tasks, interact with its surroundings, and carry out duties autonomously. Robots perform tasks with little or no human intervention and with speed and precision. In healthcare, robotics can enhance patient care and quality assurance and assist medical professionals. Robotic surgery can assist surgeons during complex procedures with less invasive techniques, thereby reducing the risk of complications and improving outcomes with enhanced precision and accuracy [52].

Robotic technology can also be used in the field of pain management. A variety of robotic devices and techniques have been used to address perceived pain or anxiety in the treatment of procedural pain. Overall, the results showed beneficial effects of robotic technologies for managing both acute and chronic pain conditions [53]. A recent study used socially assistive robots for the assessment and management of pain in residents with dementia, as well as their families and caregivers; participants received five sessions per week for 3 weeks. The results are promising for the use of robotic technology, with some limitations related mainly to staff training [54].

AI-Based Pain Assessment Tools

AI-based pain assessment tools utilize various high-tech approaches to improve the efficiency, accuracy, and objectivity of pain assessment. These tools can assist healthcare professionals in assessing, diagnosing, monitoring, and managing pain more effectively. The following are a few examples of AI-based tools for pain assessment.

Wearable Devices and Mobile Applications

The World Health Organization defines mobile health technologies as medical and public health practices supported by mobile devices [55]. Smart wearable devices or sensors are AI tools that enable continuous monitoring of human physiological activities, vital signs, and body movements without disturbing the activities of daily living [57]. Wearable devices equipped with AI technology offer the potential for real-time continuous pain monitoring and personalized interventions [57]. Wearable devices can continuously collect large amounts of data from patients to support the follow-up and management of many chronic conditions, such as diabetes, heart conditions, and chronic pain [58]. These tools collect data on pain levels, physical activity, sleep patterns, and medication adherence. Examples of wearable devices include wristbands, smart watches, wearable mobile sensors, and smart wearable shirts [5]. Recent literature reviews on the clinical impact of wearable devices on behavior change have shown promising effectiveness for digital technology [59].

Using AI, wearable smart devices can capture and analyze digital biomarkers of pain. “Digital biomarkers” is a recently defined term describing the “objective, quantifiable, physiological, and behavioral measures collected using digital devices that are portable, wearable, implantable, or digestible” [60]. Digital devices can capture vital signs such as heart rate, respiration, blood volume pulse, and skin temperature; physiological data such as skin conductance, brain activity, muscle activity, and patterns of body movement; and behavioral signals such as speech, facial expressions, and body movements [4, 5]. Numerous biosignal methods have been used to assess pain, including electrocardiography (ECG), electromyography (EMG), electrodermal activity (EDA), photoplethysmography (PPG), blood oxygen saturation (SpO2), and near-infrared spectroscopy (NIRS) [61]. Wearable-based AI systems have also been explored for pain assessment in non-verbal populations, such as infants undergoing vaccinations or individuals with communication impairments and dementia [5, 61].

Wearable sensors used for pain detection in perioperative settings rely on the reaction of the autonomic nervous system to formulate a valuable index as a pain reference. The two most common medical devices used to assess perioperative pain are the “analgesia nociception index” [62, 63], which is based on heart rate variability, and the “surgical pleth index,” which is based on plethysmography [64, 65].
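To illustrate the kind of signal features such autonomic indices build on, the following minimal sketch (Python, NumPy) computes two standard time-domain heart-rate-variability statistics, SDNN and RMSSD, from R-peak times. The actual analgesia nociception index and surgical pleth index algorithms are proprietary and are not reproduced here; the synthetic R-peak times are illustrative.

```python
# Minimal sketch: time-domain HRV features of the kind that autonomic,
# HRV-based pain indices build on. Synthetic data; illustrative only.
import numpy as np

def hrv_features(r_peak_times_s: np.ndarray) -> dict:
    """Compute simple time-domain HRV features from R-peak times in seconds."""
    rr = np.diff(r_peak_times_s) * 1000.0                 # RR intervals in ms
    sdnn = float(np.std(rr, ddof=1))                      # overall variability
    rmssd = float(np.sqrt(np.mean(np.diff(rr) ** 2)))     # beat-to-beat variability
    return {"mean_rr_ms": float(np.mean(rr)), "sdnn_ms": sdnn, "rmssd_ms": rmssd}

# Synthetic R-peak times: roughly 75 beats/min with small random variation.
rng = np.random.default_rng(0)
r_peaks = np.cumsum(0.8 + 0.05 * rng.standard_normal(120))
print(hrv_features(r_peaks))
```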

A novel approach by health systems integrates information from electronic health records (EHRs) with wearable sensors. Integrating patient data with wearable devices launches a new era of health technology [66, 67]. Accordingly, several healthcare systems have adopted solutions that connect wearable devices to EHRs. Integrating these new applications could increase patient data transparency and improve the quality of patient care [68, 69]. However, many wearable devices still lack system interoperability, cannot be easily integrated into EHR systems, and raise privacy concerns [66, 67].

Virtual Reality (VR) and Augmented Reality (AR)

VR is defined as “simulations that use various combinations of interaction between devices and sensory display systems” [70]. VR and AR technologies can create interactive, customizable, three-dimensional virtual patients (“digital twins”) for educating healthcare professionals, with vast applications for improving clinicians’ skills; they are also valuable for undergraduate basic and clinical science learning [71, 72]. VR and AR technologies are used to create environments that simulate pain-inducing scenarios in order to analyze user responses, such as physiological changes or behavioral reactions. These technologies can also be used as distraction techniques during painful procedures because they command an individual’s focus, reducing their awareness of pain.
