Methodological Frameworks and Dimensions to Be Considered in Digital Health Technology Assessment: Scoping Review and Thematic Analysis


Introduction

Background

Digital health technologies (dHTs) are driving the transformation of health care systems. They are changing the way in which health services are delivered and showing great potential to address some of the major challenges that European health systems, including the Spanish National Health System (SNS), are facing, such as the progressive aging of the population [,]; the growing demand for health and long-term care services []; the rise in health care costs, which increases financial pressure on health and welfare systems [,]; and the unequal distribution of health services across different geographical regions [,]. In addition, dHTs can improve the accessibility, sustainability, efficiency, and quality of health care systems [,], to the point of becoming a determinant of health in their own right [,].

However, the digital transformation of health care systems and the implementation of dHT (eg, artificial intelligence [AI]–based solutions, data-driven health care services, or the internet of things) are slow and unequal across different European regions [,]. Some of the reasons for this are (1) the immaturity of regulatory frameworks for the use of dHTs [], (2) the lack of funding and investment for the implementation of dHTs [], (3) the lack of sufficient and appropriate infrastructures and common standards for data management [,], (4) the absence of skills and expertise of professionals and users [], and (5) the scarcity of strong evidence regarding the real benefits and effects of dHTs on health systems and people’s well-being, as well as the cost-effectiveness of these technologies. This makes decision-making difficult, potentially leading to the development and reproduction of low-value and short-lived dHTs [,].

To overcome these challenges, harness the potential of dHTs, and avoid unintended consequences, the World Health Organization (WHO) [,] states that dHTs should be developed under the principles of transparency, accessibility, scalability, privacy, security, and confidentiality. Their implementation should be led by robust strategies that bring together leadership, financial, organizational, human, and technological resources, and decisions should be guided by the best-available evidence [,].

Regarding this last aspect, health technology assessment (HTA), defined as a “multidisciplinary process that uses explicit methods to determine the value of a health technology at different points in its life cycle,” is a widely accepted tool to inform decision-making and promote equitable, efficient, and high-quality health systems [,].

Generally, HTA is conducted according to specific methodological frameworks, such as the HTA Core Model of the European Network for Health Technology Assessment (EUnetHTA) [] and the guidelines for the development and adaptation of rapid HTA reports of the Spanish Network of Agencies for Assessing National Health System Technologies and Performance (RedETS) []. These frameworks establish the methodologies to follow and the elements to evaluate. Although they are helpful instruments for evaluating various health technologies, they have certain limitations when it comes to comprehensively assessing dHTs. For this reason, different initiatives have emerged in the past few years to adapt existing methodological frameworks or develop new ones, with the objective of considering additional domains (eg, interoperability, scalability) that cover the intrinsic characteristics of dHTs [-]. Examples of these initiatives are the Evidence Standards Framework (ESF) of the National Institute for Health and Care Excellence (NICE) [] and the Digi-HTA framework of the Finnish Coordinating Center for Health Technology Assessment (FinCCHTA) []. Nonetheless, most of these frameworks have certain constraints: they were designed for a particular socioeconomic or national setting, which restricts their transferability or suitability for use in other countries; they target or exclude specific dHTs, which limits their applicability; or there is limited evidence regarding their actual usefulness.

In this context, we performed a scoping review (ScR) with the aim of identifying the methodological frameworks used worldwide for the evaluation of dHTs; determining which dimensions and aspects are considered for each type of dHT; and generating, through a thematic analysis, a proposal for a methodological framework based on the dimensions most frequently described in the literature. This research focused mainly on mobile health (mHealth), non–face-to-face care models, and medical devices that integrate AI, as these particular dHTs are the ones most frequently assessed by the HTA agencies and units of RedETS.

Identifying Research Questions

This ScR, followed by a thematic analysis, answered the following research questions:

1. What methodological frameworks currently exist for digital health technology assessment (dHTA)?
2. What domains and dimensions are considered in dHTA?
3. Do the domains and dimensions considered depend on whether the dHT addressed is a non–face-to-face care model of health care provision, a mobile device (mHealth), or a device that incorporates AI?
Methods

Overview of Methods for Conducting the Scoping Review

We conducted an ScR of the literature and a thematic analysis of the studies included according to the published protocol []. The ScR aimed to answer the first research question, while the thematic analysis aimed to answer the second and third research questions. Spanish experts from various domains of HTA and dHT collaborated throughout the study design and development.

The ScR of the available scientific literature was carried out in accordance with the PRISMA-ScR (Preferred Reporting Items for Systematic Reviews and Meta-Analyses extension for Scoping Reviews) guidelines () [] and following the recommendations of Peters et al [] and Pollock et al [].

Ethical Considerations

As this work was an ScR, no ethical board approval was required.

Search Strategy

The search strategy () was designed by an experienced information specialist (author RP-P) in accordance with the research questions and using the validated filter of Ayiku et al [] for health apps, adding the terms for concepts related to mHealth, remote care models, AI, digital health, methodological frameworks, and HTA. The strategy was peer-reviewed according to the “Peer Review of Electronic Search Strategies Statement” [] by authors JS-F and CM-P and was executed in the following 7 databases, considering the characteristics of each in terms of syntax, controlled vocabulary, and proximity operators: Medline (OVID), CINAHL Plus, Embase, Cochrane Library, Scopus, Web of Science, and TripDatabase. Note that no time, language, or other filters were used.

The identification of relevant studies was complemented with a manual search based on the references in the included studies, as well as the websites of the HTA agencies identified through the web pages of EUnetHTA, the International Network for Agencies for Health Technology Assessment (INAHTA), and Health Technology Assessment International (HTAi). Additionally, a search was conducted in Google Scholar, limiting the results to the first 250 items in order to guarantee the inclusion of all pertinent studies [].

Inclusion and Exclusion Criteria

The inclusion criteria used in the reference-screening process were based on the previously detailed research questions and are outlined in using the Population/Problem, Phenomenon of Interest, Context and Design (PICo-D) format [,]. The PICo-D format was used instead of the traditional Population/Problem, Intervention, Comparator, Outcomes, Design (PICO-D) format due to the qualitative nature of the research questions and the characteristics of the phenomenon of interest.

Studies were excluded if they were published before 2011 (due to the rapid evolution of dHTs in the past few years), did not describe dimensions or evaluation criteria, or were based on methodological frameworks not intended for the assessment of dHTs (eg, the EUnetHTA Core Model 3.0). Likewise, we excluded comments, editorials, letters, conference abstracts, frameworks or tools focusing on the evaluation of dHTs by users (eg, the user version of the Mobile App Rating Scale [uMARS]), and documents in languages other than English, Spanish, or Catalan.

Textbox 1. Research questions in Population/Problem, Phenomenon of Interest, Context and Design (PICo-D) format.

Population/problem

Digital health technology assessment (dHTA)

Phenomenon of interest

Specific methodological frameworks for the evaluation of digital health (with special focus on mobile health [mHealth], non–face-to-face care models, and medical devices that integrate artificial intelligence [AI], due to the types of technologies most frequently assessed in the Spanish National Health System [SNS]) that describe the domains to be evaluated in dHTA

Context

Health technology assessment (HTA)

Design

Methodological guidelines and frameworks, scoping reviews (ScRs), systematic reviews (SRs), consensus documents, and qualitative studies

Reference Screening and Data Extraction

The screening of studies was carried out by authors CM-P and JS-F in 2 phases in accordance with the selection criteria detailed earlier () and in a single-blind peer review manner. The first phase consisted of screening of the titles and abstracts of the studies identified in the bibliographic search. The second phase consisted of full-text screening of the studies included in the previous phase.

Data extraction was performed by 3 authors (CM-P, RP-P, and JS-F) using the web and desktop versions of ATLAS.ti version 22.0 (Scientific Software Development GmbH) [] and the data extraction sheets designed ad hoc for this purpose following the recommendations of the Cochrane Handbook for Systematic Reviews of Interventions [].

When disagreements emerged in either of the 2 processes, a consensus was reached between the 3 reviewers (CM-P, RP-P, and JS-F). When a consensus was not possible, a fourth reviewer (author RMV-H) was consulted.

Collecting, Summarizing, and Reporting the Results

A descriptive analysis was carried out to evaluate and report the existing methodological frameworks and their characteristics.

Overview of Methods for Thematic Analysis

The thematic analysis was performed following the recommendations and phases described by Thomas and Harden [] to determine HTA dimensions for dHTs: (1) line-by-line text coding, (2) development of descriptive topics, and (3) generation of analytical themes. Both analyses were carried out by 3 authors (CM-P, RP-P, and JS-F) using the web and desktop versions of ATLAS.ti version 22.0 [].

Dimensions identified from systematic reviews (SRs) that were derived from primary studies also identified in our systematic search were only counted once in order to avoid duplication of data and risk of bias. It is worth mentioning that the primary studies included in the SRs were not directly analyzed but were analyzed through the findings reported in the SRs.


Results

Study Selection and Characteristics

A total of 3042 studies were retrieved through the systematic (n=3023, 99.4%) and manual (n=19, 0.6%) searches. Of these, 2238 (73.6%) studies were identified as unique after removing duplicates.

After title and abstract review, 81 (3.6%) studies were selected for full-text review, of which 26 (32.1%) were finally included in the analysis. The excluded studies and reasons for exclusion are detailed in ; in brief, the reasons for exclusion were phenomenon of interest (n=30, 37%), type of publication (n=15, 18.5%), purpose (n=6, 7.4%), language (n=2, 2.5%), and duplicated information (n=2, 2.5%). The study selection process is outlined in [].
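As an illustrative arithmetic cross-check (not part of the original review workflow), the selection counts reported above can be reproduced in a few lines; all numbers are taken directly from the text:

```python
# Consistency check of the study-selection counts reported in the text.
retrieved = {"systematic": 3023, "manual": 19}
total = sum(retrieved.values())  # 3042 records retrieved in total
unique = 2238                    # unique records after duplicate removal
full_text = 81                   # studies selected for full-text review
included = 26                    # studies included in the analysis

# Reasons for exclusion at the full-text stage, as reported in the text.
exclusions = {"phenomenon of interest": 30, "type of publication": 15,
              "purpose": 6, "language": 2, "duplicated information": 2}

assert total == 3042
assert sum(exclusions.values()) == full_text - included  # 55 studies excluded
print(f"unique: {unique / total:.1%} of retrieved")        # unique: 73.6% of retrieved
print(f"included: {included / full_text:.1%} of full-text")  # included: 32.1% of full-text
```

The exclusion reasons sum exactly to the 55 studies excluded at full text (81 minus 26), and the percentages match those reported.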

Of the 26 (32.1%) studies included in this ScR, 19 (73.1%) were designed as specific methodological frameworks for dHTA [,,-], 4 (15.4%) were SRs [-], 1 (3.9%) was a report from the European mHealth Hub’s working group on mHealth assessment guidelines [], 1 (3.9%) was a qualitative study [], and 1 (3.9%) was a viewpoint []. In addition, 3 (11.5%) focused on the assessment of non–face-to-face care models [-], 8 (30.8%) on mHealth assessment [-,,,], 2 (7.7%) on the assessment of AI technology [,], 4 (15.4%) on eHealth [,,,], and 9 (34.6%) on the overall assessment of digital health [,,-,,,].

Figure 1. PRISMA 2020 flow diagram of the search and study selection process for new SRs, meta-analyses, and ScRs. PRISMA: Preferred Reporting Items for Systematic Reviews and Meta-Analyses; ScR: scoping review; SR: systematic review.

Research Question 1: Description of Identified Frameworks for dHTA

The 19 methodological frameworks for dHTA [,,-] were from various countries: the largest number (n=5, 26.3%) originated in Australia [,,,,], followed by 3 (15.8%) from the United States [,,] and 2 (10.5%) from Switzerland [,]; the remaining 9 (47.4%) frameworks were developed in Afghanistan [], Denmark [], Scotland [], Finland [], Ireland [], Israel [], the United Kingdom [], Spain [], and Sweden [].

The 19 methodological frameworks focused on evaluating various types of technologies. Specifically, 3 (15.8%) of them were designed for assessing non–face-to-face care models [-], 6 (31.6%) for mHealth [-], and 1 (5.3%) for AI solutions []. The other 9 (47.4%) frameworks addressed eHealth [,,] or digital health in general [,,-], which encompasses non–face-to-face care models, mHealth, and occasionally AI-based solutions [] within their scope. It is pertinent to mention that the differentiation between the methodological frameworks designed for the evaluation of eHealth and those designed for dHTA was based on the specific terminology and descriptions used by the authors of those frameworks.

The structures and characteristics of the analyzed methodological frameworks were considered heterogeneous in terms of evaluation specificity (whether they focused on a global evaluation that encompassed more than 1 domain or dimension or on a specific assessment that addressed only 1 domain or dimension), assessment approach (whether they adopted a phased evaluation, a domain evaluation, or a hybrid of both), and number of domains included. Regarding evaluation specificity, 17 (89.5%) methodological frameworks were classified as global as they covered various aspects or domains within their scope [,,-,-,,], while 2 (10.5%) were classified as specific as they concentrated exclusively on 1 element or domain of assessment [,]. Regarding the assessment approach, 14 (73.7%) methodological frameworks proposed a domain-based evaluation [, , , , , -, , , , , ], while 4 (21.1%) proposed a hybrid one (phased and domain based) [,,,]; the remaining methodological framework did not fit into any of the previous categories, as it was not structured by domains or phases but by types of risk []. Finally, the number of evaluation domains considered ranged from 1 to 14, with an average of 7. outlines the primary features of the included methodological frameworks and provides a thorough breakdown of the domains and dimensions they address.

In addition, from 3 (75%) [-] of the 4 SRs [-] and from the report of the European mHealth Hub working group on guidelines for the evaluation of mHealth solutions [], we identified other methodological frameworks and tools focusing on the assessment of dHTs. Specifically, we identified 16 methodological frameworks or tools focusing on the evaluation of non–face-to-face care models [-], along with 37 for the evaluation of mHealth [,,-], 11 for the evaluation of eHealth [-], and 17 for the evaluation of dHTs in general [-]. Additionally, 5 (26.3%) [,,,,] of the 19 methodological frameworks included in this ScR were also identified and analyzed in 1 or more of the 4 literature synthesis documents [-]. It is important to note that the difference between the frameworks we retrieved through our systematic search and those identified in the 4 SRs results from the narrower perspective we adopted, focusing exclusively on frameworks directly relevant to the HTA field, in line with the aims of our study. In , we provide a more detailed explanation of the methodological frameworks included in the studies mentioned earlier [,-,-,-].

Table 1. Methodological frameworks (N=19) included in this ScRa. Row format: methodological framework, year | country | assessment specificity | assessment approach | assessment domains (n).

Methodological frameworks focusing on the assessment of non–face-to-face care models (n=3, 15.8%)

Model for Assessment of Telemedicine Applications (MAST), 2012 [] | Denmark | Overall evaluation | By domain | Health problem and application description; security; clinical effectiveness; patient perspective; economic aspects; organizational aspects; sociocultural, ethical, and legal aspects (n=7)

Scottish Centre for Telehealth & Telecare (SCTT) Toolkit [] | Scotland | Overall evaluation | By domain | User benefits/costs; benefits/service costs; user experience; increased use of the technology platform available to support routine local services; confidence in the use and awareness of staff; awareness of telehealth and telecare as tools (n=6)

Telehealth framework, 2014 [] | Australia | Overall evaluation | By domain | Health domain; health services; communication technologies; environment configuration; socioeconomic evaluation (n=5)

Methodological frameworks focusing on the evaluation of mHealthb technologies (n=6, 31.6%)

Caulfield’s evaluation framework, 2019 [] | Ireland | Overall evaluation | By domain | Context information; cost information; normative compliance; scientific evidence; human factors; data collection and interpretation (n=6)

mHealth-based technology assessment for mobile apps, 2020 [] | Spain | Overall evaluation | By domain | General information about the clinical condition and about the mHealth solution; privacy and security; technological aspects and interoperability; evidence and clinical effectiveness; user experience, usability, acceptability, ease of use, and aesthetics; costs and economic evaluation; impact on the organization (n=7)

Henson’s app evaluation framework, 2019 [] | Israel | Overall evaluation | By domain | Context information; privacy/security; scientific evidence; usability; data integration (n=5)

Lewis’s assessment risk framework, 2014 [] | United Kingdom | Risk/safety assessment | N/Ac | Risk (n=1)

Mobile medical app evaluation module, 2020 [] | Australia | Overall evaluation | By domain | Description and technical characteristics; current use of technology; effectiveness; security; cost-effectiveness; organizational aspects; ethical aspects; legal aspects; postmarket monitoring; social aspects (n=10)

Vokinger, 2020 [] | Switzerland | Overall evaluation | By domain | Purpose; usability; information accuracy; organizational reputation; transparency; privacy; self-determination or user control (n=7)

Methodological frameworks focusing on the assessment of solutions based on AId (n=1, 5.3%)

Translational Evaluation of Healthcare AI (TEHAI), 2021 [] | Australia | Overall evaluation | Hybrid | Capability; utility; adoption (n=3)

Methodological frameworks focusing on the evaluation of eHealth technologies (n=3, 15.8%)

Health Information Technology Evaluation Framework (HITREF), 2015 [] | United States | Overall evaluation | By domain | Structural quality; quality of information logistics; unintended consequences/benefits; effects on quality-of-care outcomes; effects on process quality (n=5)

Heuristic evaluation of eHealth interventions, 2016 [] | United States | Overall evaluation | By domain | Usability/ease of use/functionality; aesthetics; security; content; adherence; persuasive design; research evidence; owner credibility (n=8)

Khoja-Durrani-Sajwani (KDS) Framework, 2013 [] | Afghanistan, Canada, Kenya, Pakistan | Overall evaluation | Hybrid | Health service results; technology results; economic results; sociotechnical and behavioral results; ethical results; preparation and change results; results of the regulation (n=7)

Methodological frameworks focusing on the evaluation of dHTse (n=6, 31.6%)

Deontic accountability framework, 2019 [] | Australia | Ethical evaluation | By domain | Ethical principles (n=1)

Digi-HTA, 2019 [] | Finland | Overall evaluation | By domain | Company information; product information; technical stability; costs; effectiveness; clinical safety; data protection and security; usability and accessibility; interoperability; AI; robots (n=11)

Digital Health Scorecard, 2019 [] | United States | Overall evaluation | Hybrid | Technical validation; clinical validation; usability; costs (n=4)

Framework for the design and evaluation of digital health interventions (DEDHI), 2019 [] | Sweden | Overall evaluation | By domain | Easy to use; content quality; privacy and security; responsibility; adherence; aesthetics; perceived benefits; effectiveness; quality of service; personalization; perceived enjoyment; ethics; security (n=13)

Monitoring and Evaluating Digital Health Interventions Guide, 2016 [] | Switzerland | Overall evaluation | Hybrid | Costs; feasibility; usability; effectiveness; implementation science; efficiency; quality; use (n=8)

Precision Health Applications Evaluation Framework, 2021 [] | Australia | Overall evaluation | By domain | Novelty; adaptability; information management; performance; clinical effectiveness; quality assurance (n=6)

aScR: scoping review.

bmHealth: mobile health.

cN/A: not applicable.

dAI: artificial intelligence.

edHT: digital health technology.

Research Question 2: Domains and Dimensions Being Considered in dHTA

The 26 (32.1%) studies included encompassed a broad range of items to consider in dHTA and often used diverse expressions for analogous concepts. We reduced this heterogeneity through our thematic analysis according to the recommendations and phases described by Thomas and Harden [].

In this sense, in the first phase of thematic analysis, we identified and coded 176 units of meaning (coded as provisional codes) that represented different items (domains or dimensions) of the assessment. These units were then grouped into 86 descriptive themes (second phase), which were further refined into 61 analytical themes that captured the key concepts and relationships between them (third phase). Lastly, the 61 analytical themes were arranged in a 3-level vertical hierarchy based on the evidence: level 1 (12 domains), level 2 (38 dimensions), and level 3 (11 subdimensions). We used the term “domain” to refer to a distinct area or topic of evaluation that is integral to the assessment of the technology in question. A domain may encompass multiple related concepts or dimensions that are relevant to the evaluation. Each dimension, in turn, represents a specific aspect of evaluation that belongs to the domain and contributes to an understanding of its overall significance. Finally, a subdimension refers to a partial element of a dimension that facilitates its analysis. By using these terms, we aimed to provide a clear, rigorous, and comprehensive framework for conducting HTA.
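The 3-level hierarchy described above (domains containing dimensions, which may in turn contain subdimensions) can be sketched as a simple tree structure. The fragment below is illustrative only: it uses a handful of themes and coding frequencies taken from Table 2, and the code is not part of the study.

```python
# Minimal sketch of the 3-level hierarchy of analytical themes:
# domains (level 1) contain dimensions (level 2), which may contain
# subdimensions (level 3). Names and counts are a fragment from Table 2.
from dataclasses import dataclass, field

@dataclass
class Theme:
    name: str
    coding_frequency: int          # number of times the theme was coded
    children: list["Theme"] = field(default_factory=list)

framework = [
    Theme("Safety", 19, [
        Theme("Clinical safety", 12),
        Theme("Technical safety", 11),
    ]),
    Theme("Technical aspects", 4, [
        Theme("Adaptability", 8, [Theme("Interoperability", 4)]),
    ]),
]

def count_themes(themes: list[Theme]) -> int:
    """Count all themes at every level of the hierarchy."""
    return sum(1 + count_themes(t.children) for t in themes)

print(count_themes(framework))  # 6 themes in this small fragment
```

In the full framework, the same structure holds 61 analytical themes: 12 domains, 38 dimensions, and 11 subdimensions.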

displays the 61 analytical themes in descending order of coding frequency, aligned with the hierarchy derived from the data analysis. Additionally, the table specifies the intervention modalities or dHTs that correspond to each code and lists the studies from which each code originated. The network of relationships among the codes can be found in .

Table 2. Analytical themes of the thematic analysis presented in descending order of coding frequency and aligned with the hierarchy derived from the data analysis. Domain rows follow the format: domain (level 1) | type of dHTa; indented rows follow the format: dimension (level 2) | subdimension (level 3) | type of dHTa.

Description of the technology (n=19, 6.2%) | Non–face-to-face care models [,], mHealthb [,-,], AIc [], eHealth [], digital health [,,,]
  Credibility and reputation (n=5, 1.6%) | —d | mHealth [,], digital health []
  Scientific basis (n=5, 1.6%) | — | mHealth [,,], digital health [,]
  Technical evaluation and validation (n=3, 1.0%) | — | mHealth [], digital health []
  Adoption (n=2, 0.6%) | — | AI [], digital health []
  Adoption (n=2, 0.6%) | Usage (n=2, 0.6%) | AI [], digital health []
  Adoption (n=2, 0.6%) | Integration (n=1, 0.3%) | AI []
  Information management (n=2, 0.6%) | — | eHealth [], digital health []
  Novelty (n=1, 0.3%) | — | Digital health []

Safety (n=19, 6.2%) | Non–face-to-face care models [], mHealth [-,], AI [], eHealth [,], digital health [,,,,]
  Clinical safety (n=12, 3.9%) | — | Non–face-to-face care models [], mHealth [,,], AI [], eHealth [,], digital health [,,,]
  Technical safety (n=11, 3.6%) | — | Non–face-to-face care models [], mHealth [,,], digital health [,,,,]

Clinical efficacy and effectiveness (n=17, 5.5%) | Non–face-to-face care models [,], mHealth [,,,], eHealth [,], digital health [,,,,,]

Economic aspects (n=16, 5.2%) | Non–face-to-face care models [-], mHealth [,,], AI [], eHealth [,,], digital health [,,,,]
  Costs (n=10, 3.2%) | — | Non–face-to-face care models [,], mHealth [,], AI [], digital health [,,,,]
  Economic evaluation (n=7, 2.3%) | — | Non–face-to-face care models [,], mHealth [,], eHealth [,], digital health []
  Use of resources (n=4, 1.3%) and efficiency (n=1, 0.3%) | — | Non–face-to-face care models [], mHealth [], AI [], digital health [,]

Ethical aspects (n=13, 4.2%) | Non–face-to-face care models [], mHealth [,], AI [,], eHealth [,,], digital health [,,,]
  Equity (n=1, 0.3%) | — | Digital health []
  User control and self-determination (n=1, 0.3%) | — | mHealth [], digital health []
  Responsibility (n=1, 0.3%) | — | Digital health [,]
  Explainability (n=1, 0.3%) | — | Digital health []

Human and sociocultural aspects (n=13, 4.2%) | Non–face-to-face care models [,], mHealth [,,], AI [], eHealth [,], digital health [,,]
  User experience (n=7, 2.3%) | — | Non–face-to-face care models [], mHealth [,,], digital health [,,]
  Accessibility (n=3, 1.0%) | — | mHealth []
  Acceptability (n=2, 0.6%) | — | mHealth [], AI []
  Engagement (n=2, 0.6%) | — | Digital health [,]
  Perceived profit (n=1, 0.3%) | — | Digital health []

Organizational aspects (n=11, 3.6%) | Non–face-to-face care models [], mHealth [,], AI [,], eHealth [,,,], digital health [,]

Legal and regulatory aspects (n=10, 3.2%) | Non–face-to-face care models [], mHealth [,], AI [], eHealth [,,], digital health [,,]
  Privacy (n=6, 1.9%) | — | mHealth [,,,], AI [], digital health []
  Transparency (n=4, 1.3%) | — | mHealth [,], AI [,]
  Responsibility (n=1, 0.3%) | — | Digital health []

Description of health problem (n=8, 2.6%) | Non–face-to-face care models [,], mHealth [], eHealth [], digital health [,]

Content (n=5, 1.6%) | mHealth [], eHealth [], digital health [,]
  Information adequacy (n=2, 0.6%) | — | mHealth [], digital health []
  Intervention adequacy (n=2, 0.6%) | — | Digital health []

Technical aspects (n=4, 1.3%) | AI [], eHealth [,], digital health []
  Usability (n=10, 3.2%) | — | mHealth [,,], digital health [,,,,,]
  Adaptability (n=8, 2.6%) | — | Digital health []
  Adaptability (n=8, 2.6%) | Interoperability (n=4, 1.3%) | mHealth [,], digital health [,]
  Adaptability (n=8, 2.6%) | Scalability (n=2, 0.6%) | mHealth [], AI []
  Adaptability (n=8, 2.6%) | Integration of data (n=1, 0.3%) | mHealth []
  Adaptability (n=8, 2.6%) | Transferability (n=1, 0.3%) | eHealth []
  Quality (n=5, 1.6%) | — | eHealth [], digital health [,,]
  Design (n=5, 1.6%) | — | Digital health []
  Design (n=5, 1.6%) | Persuasive design (n=1, 0.3%) | Digital health []
  Technical stability (n=4, 1.3%) | — | mHealth [,], digital health [,,]
  Aesthetics (n=3, 1.0%) | — | mHealth [], digital health [,]
  Ease of use (n=3, 1.0%) | — | mHealth [,], digital health []
  Accessibility (n=2, 0.6%) | — | mHealth [], digital health []
  Technical effectiveness (n=1, 0.3%) or performance (n=2, 0.6%) | — | Digital health [,]
  Technical effectiveness (n=1, 0.3%) or performance (n=2, 0.6%) | Reliability (n=6, 1.9%) | mHealth [,], digital health []
  Technical effectiveness (n=1, 0.3%) or performance (n=2, 0.6%) | Validity (n=5, 1.6%) | mHealth [,], AI []
  Technical effectiveness (n=1, 0.3%) or performance (n=2, 0.6%) | Accuracy (n=2, 0.6%) | Digital health []
  Technical effectiveness (n=1, 0.3%) or performance (n=2, 0.6%) | Sensitivity (n=1, 0.3%) | Digital health []
  Feasibility (n=1, 0.3%) | — | Digital health []
  Generalizability and reproducibility (n=1, 0.3%) | — | AI []
  Interpretability (n=1, 0.3%) | — | AI []
  Customization (n=1, 0.3%) | — | Digital health []

Postmarketing monitoring (n=3, 1.0%) | mHealth [], digital health []

adHT: digital health technology.

bmHealth: mobile health.

cAI: artificial intelligence.

dN/A: not applicable.

Research Question 3: Variability of Domains and Dimensions Among Technologies

Our thematic analysis revealed a significant degree of variability and heterogeneity in the number and type of domains and dimensions considered by the methodological frameworks.

In terms of numbers, the variability was quite pronounced when we compared frameworks addressing different types of dHTs. For instance, the thematic analysis of frameworks for assessing telemedicine identified only 9 (75%) domains and 6 (15.8%) dimensions; in contrast, in frameworks for assessing mHealth, we identified 10 (83.3%) domains, 20 (52.6%) dimensions, and 6 (54.5%) subdimensions, and in frameworks for assessing AI, we identified 8 (66.7%) different domains, 7 (18.4%) different dimensions, and 6 (54.5%) subdimensions.
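As an illustration (not part of the original analysis), each coverage percentage quoted above is simply the count of identified items divided by the totals of the proposed framework (12 domains, 38 dimensions, 11 subdimensions):

```python
# Recomputing the coverage shares per type of dHT from the counts in the text.
totals = {"domains": 12, "dimensions": 38, "subdimensions": 11}
coverage = {
    "telemedicine": {"domains": 9, "dimensions": 6},
    "mHealth":      {"domains": 10, "dimensions": 20, "subdimensions": 6},
    "AI":           {"domains": 8, "dimensions": 7, "subdimensions": 6},
}
for tech, counts in coverage.items():
    shares = {level: f"{n}/{totals[level]} ({n / totals[level]:.1%})"
              for level, n in counts.items()}
    print(tech, shares)  # eg, telemedicine: 9/12 (75.0%), 6/38 (15.8%)
```

This reproduces the figures in the paragraph, for example 9/12 = 75% of domains for telemedicine and 20/38 = 52.6% of dimensions for mHealth.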

In terms of the types of domains considered, certain dimensions and domains were identified as more distinctive for one kind of dHT than for another. For instance, clinical efficacy and effectiveness, technical safety, economic evaluation, and user experience were relevant for the evaluation of non–face-to-face care models and mHealth but not for AI. In contrast, there were specific dimensions and domains of mHealth that were not considered in the evaluation of non–face-to-face health care or AI, such as postmarketing monitoring, scientific basis, technical evaluation and validation, user control and self-determination, accessibility, content and adequacy of information, and data interoperability and integration. Finally, specific methodological frameworks for the evaluation of AI included dimensions such as technical aspects, adoption, use, integration, generalizability, reproducibility, and interpretability, which were not considered in the evaluation of telemedicine or mHealth.

Proposal for Domains, Dimensions, and Subdimensions for dHTA

These findings led to the development of a proposed methodological framework for dHTA, which comprises domains, dimensions, and subdimensions. These evaluation items were established objectively based on thematically analyzed evidence, without incorporating the researcher’s perspective. Consequently, the proposal for domains, dimensions, and subdimensions emerged from the literature and represents the entirety of identified evaluation domains, dimensions, and subdimensions (n=61). presents a visual representation of the proposed framework comprising 12 domains, 38 dimensions, and their corresponding 11 subdimensions. Notably, the figure highlights certain domains, dimensions, and subdimensions that are particularly relevant to the evaluation of non–face-to-face care models, mHealth, and AI according to the evidence.

Figure 2. Proposed methodological framework for dHTA. aDimension identified as especially relevant for non–face-to-face care models; bdimension identified as especially relevant for mHealth; cdimension identified as especially relevant for AI; dHTA: digital health technology assessment. A higher-resolution version of this image is available as .
Discussion

Principal Findings

In recent years, the interest in digital health has increased significantly, giving rise to a myriad of available technologies. This has brought about a profound transformation in health care systems, fundamentally changing the provision and consumption of health care services []. However, despite these advancements, the shift toward digital health has been accompanied by challenges. One such challenge is the emergence of a plethora of short-lived implementations and an overwhelming diversity of digital tools, which has created a need for careful evaluation and analysis of the benefits and drawbacks of these technologies [].

In this context, our ScR aimed to identify the methodological frameworks used worldwide for the assessment of dHTs; determine what domains are considered; and generate, through a thematic analysis, a proposal for a methodological framework based on the most frequently described domains in the literature.

Throughout the ScR, we identified a total of 102 methodological frameworks and tools, of which 19 [,,-] were directly identified through a systematic search and 83 were indirectly identified through 4 SRs [-]. The difference between the number of methodological frameworks identified through the ScR and through the 4 evidence synthesis documents [-] is attributed to the inclusion of keywords related to the concept of HTA in the search syntax, the exclusion of methodological frameworks published before 2011 during the screening process, and the difference in perspective between this paper and the 4 evidence synthesis documents mentioned earlier. In this sense, these 4 documents [-] analyzed methodological frameworks and tools for evaluating digital health that were not developed from an HTA perspective, despite the authors treating them as such. For example, von Huben et al [] included in their analysis the Consolidated Standards of Reporting Trials (CONSORT)-EHEALTH tool [], which describes the information that should be reported in papers and reports evaluating web- and mHealth-based interventions; Kolasa et al [] included the mobile health evidence reporting and assessment (mERA) checklist [], which determines the information that should be reported in trials evaluating mHealth solutions; and the European mHealth Hub document [] included the iSYS Score, a tool for cataloguing smartphone apps.

However, as detailed in the Results section, some of the methodological frameworks identified through the ScR were characterized by their authors as specific to certain types of dHTs (eg, non–face-to-face care models, mHealth), and they presented certain differences according to each typology. It is important to note that the differentiation among various types of dHTs, as described throughout this paper and commonly used in the field of digital health, cannot always be made in a precise and exclusive manner []. This is because a technology can often be classified in more than 1 category. For instance, an mHealth solution may use AI algorithms while simultaneously being integrated into a non–face-to-face care model []. In this context, future research should consider using alternative taxonomies or classification methods based on the intended purpose of the technology, such as those proposed by NICE in the updated version of the Evidence Standards Framework [] or the new classification of digital health interventions put forward by WHO [].

After conducting a thematic analysis of the 26 included studies, we observed that the various methodological frameworks include a set of evaluation items, referred to as domains, dimensions, or criteria. These items primarily focus on the safety; effectiveness; technical aspects; economic impact; and ethical, legal, and social consequences of dHTs. However, there is significant heterogeneity among these frameworks in terms of how they refer to the evaluation items, the quantity and depth of their descriptions, the degree of granularity, and the proposed evaluation methods, especially when comparing frameworks that focus on different types of dHTs. Despite this heterogeneity, most methodological frameworks consider evaluation items related to the 9 domains described by the HTA Core Model of EUnetHTA, while some frameworks propose additional evaluation elements, such as usability [,,,,,], privacy [-,,,], and technical stability [,,,,], among others. These findings are consistent with earlier research [,].

In addition, through the thematic analysis, the heterogeneity identified among the different methodological frameworks included in this ScR was reduced to a total of 61 analytical themes related to various evaluation elements, which were arranged in a 3-level vertical hierarchy based on the evidence: level 1 (12 domains), level 2 (38 dimensions), and level 3 (11 subdimensions). At this point, it is pertinent to note that although, from the researchers’ perspective, some dimensions could have been classified under different domains (eg, responsibility under ethical aspects) or seen as essential for other kinds of dHTs, an effort was made to maintain the highest degree of objectivity possible. It is for this reason that privacy issues were not described as essential for non–face-to-face care models and that the dimension of accessibility was categorized within both the human and sociocultural aspects domain and the technical aspects domain. This categorization was made because some of the methodological frameworks analyzed associated accessibility with sociocultural elements (eg, evaluating whether users with functional diversity can access the technology and have sufficient ability to use it as expected), while others linked it to technical elements (eg, the adequacy of the accessibility elements, options, or functionalities that the system incorporates according to the target audience) [,].

The ScR and thematic analysis conducted in this study led to a proposal for a methodological framework for dHTA. This framework was further developed using additional methodologies, such as consensus workshops, by the Agency for Health Quality and Assessment of Catalonia (AQuAS) in collaboration with all agencies of RedETS, commissioned by the Ministry of Health of Spain. The final framework is a specific methodological tool for the assessment of dHTs, aimed at describing the domains and dimensions to be considered in dHTA and defining the evidence standards that such technologies must meet based on their associated risk level. The proposed methodological framework enables the assessment of a wide range of dHTs, mainly those classified as medical devices according to Regulation (EU) 2017/745 on medical devices [] and Regulation (EU) 2017/746 on in vitro diagnostic medical devices, although it can be adapted to assess dHTs not classified as medical devices []. Unlike existing frameworks, it establishes a clear link between the identified domains and dimensions and the evidence standards that dHTs are required to meet. This approach will enhance the transparency and consistency of dHTAs and support evidence-based decision-making. The final document was published in November 2023 and is available in Spanish on the RedETS website as well as on the main web page of AQuAS []. Since the first week of February, the respective websites have also hosted an English version of this document [], which is accessible in the INAHTA database as well. In addition, the Spanish and English versions of the document will be periodically reviewed and, if necessary, adapted to align with emerging technologies and changes in legislation.

Limitations

Although this ScR was conducted in accordance with the PRISMA-ScR guidelines () and following the recommendations of Peters et al [] and Pollock et al [], there were some limitations. First, given the perspective of our ScR, the search incorporated a block of keywords related to the concept of HTA (see ), which may have limited the retrieval of some studies relevant to the study objective. However, this limitation was compensated for by the analysis of the 3 SRs and the report of the working group on guidelines for the evaluation of mHealth solutions of the European mHealth Hub. Second, much of the literature related to HTA is gray literature published only on the websites of the authoring agencies. Despite efforts to address this limitation through expert input and a comprehensive search of the websites of the world’s leading agencies, it is possible that certain studies were not identified. Third, the quality and limitations of the analyses conducted by the authors of the methodological frameworks and tools included in the SRs may have had an impact on the indirect thematic analysis. Therefore, it is possible that some data were omitted or not considered during this process. Fourth, the focus on dHTs encompassed within the 3 previously mentioned categories (mHealth, non–face-to-face care models, and medical devices that integrate AI) may have influenced the outcomes of the thematic analysis. Fifth, only methodological frameworks written in Catalan, Spanish, and English were included.

Comparison With Prior Work

To the best of our knowledge, this is the first ScR to examine the methodological frameworks for dHTA, followed by a thematic analysis, with the aim of proposing a new comprehensive framework that incorporates the existing literature in an objective manner and enables the assessment of the various technologies included under the concept of digital health. In this sense, existing SRs and other evidence synthesis documents have only analyzed the literature and reported the results descriptively [,,,,,,]. Furthermore, this ScR considered, in addition to scientific literature, gray literature identified by searching the websites of the agencies, thus addressing some limitations of previous reviews []. Moreover, this review was carried out from the perspective of HTA, addressing a clear need expressed by HTA agencies [].

Future research should aim to identify what domains and dimensions are relevant at the different stages of the technology life cycle, to establish or develop a standardized set of outcomes for assessing or reporting each domain, and to evaluate the effectiveness and usefulness of the existing methodological frameworks for the different intended users [,]. Moreover, future research should aim to determine the specific evaluation criteria that ought to be considered based on the level of risk associated with different types of technologies [].

Conclusion

Our ScR revealed a total of 102 methodological frameworks and tools designed for evaluating dHTs, with 19 being directly identified through a systematic search and 83 through 4 evidence synthesis documents. Only 19 of all the identified frameworks were developed from the perspective of HTA. These frameworks vary in assessment items, structure, and specificity, and evidence of their usefulness in practice remains scarce.

The thematic analysis of the 26 studies that met the inclusion criteria led to the identification and definition of 12 domains, 38 dimensions, and 11 subdimensions that should be considered when evaluating dHTs. Building on our results, a methodological framework for dHTA was proposed.

Acknowledgments

We acknowledge Benigno Rosón Calvo (Servicio Gallego de Salud [SERGAS]), Carme Carrion (Universitat Oberta de Catalunya [UOC]), Carlos A Molina Carrón (Dirección General de Salud Digital y Sistemas de Información para el SNS, Ministerio de Sanidad, Gobierno de España), Carme Pratdepadua (Fundació Tic Salut i Social [FTSS]), Celia Muñoz (Instituto Aragonés de Ciencias de la Salud [IACS]), David Pijoan (Biocat, BioRegió de Catalunya), Felip Miralles (Eurecat – Centre Tecnològic de Catalunya), Iñaki Gutierrez Ibarluzea (Osasun Teknologien Ebaluazioko Zerbitzua [Osteba]), Janet Puñal Riobóo (Unidad de Asesoramiento Científico-técnico [avalia-t], Agencia Gallega para la Gestión del Conocimiento en Salud [ACIS]), Jordi Piera-Jiménez (Àrea de Sistemes d’Informació del Servei Català de la Salut [CatSalut]), Juan Antonio Blasco (Evaluación de Tecnologías Sanitarias de Andalucía [AETSA]), Liliana Arroyo Moliner (Direcció General de Societat Digital, Departament d’Empresa i Treball de la Generalitat de Catalunya), Lilisbeth Perestelo-Perez (Servicio de Evaluación del Servicio Canario de la Salud [SESCS]), Lucía Prieto Remón (IACS), Marifé Lapeña (Dirección General de Salud Digital y Sistemas de Información para el SNS, Ministerio de Sanidad, Gobierno de España), Mario Cárdaba (Instituto de Salud Carlos III [ISCIII]), Montserrat Daban (Biocat, BioRegió de Catalunya), Montserrat Moharra Frances (Agència de Qualitat i Avaluació Sanitàries de Catalunya), and Oscar Solans (CatSalut) for reviewing the protocol of this scoping review (ScR) and the ScR.

This research was framed within the budget of the work plan of the Spanish Network of Health Technology Assessment Agencies, commissioned by the General Directorate of Common Portfolio of Services of the National Health System and Pharmacy.

Authors' Contributions

JS-F and CM-P were responsible for conceptualization, methodology, formal analysis, investigation, data curation, writing—original draft, and visualization. RP-P handled conceptualization, methodology, formal analysis, investigation, resources, and writing—original draft. RMV-H handled conceptualization, writing—review and editing, supervision, and project administration.

Conflicts of Interest

None declared.

Edited by T Leung; submitted 03.05.23; peer-rev
