Content Validation of a Questionnaire to Measure Digital Competence of Nurses in Clinical Practice

Nurses globally are increasingly affected by digitalization in their daily work, for example through the everyday use of electronic health records (EHRs).1–3 The digitalization of healthcare has brought numerous possibilities for utilizing digital technologies at work in nursing.4 These possibilities impact nursing interventions, such as the use of telehealth to reduce emergency admissions for patients with chronic diseases.5 Furthermore, digital technologies are used to enhance administrative processes, such as the implementation of EHRs.6

According to the technology acceptance model,7 the extent to which technologies are perceived as helpful and useful is determined by “perceived ease of use” and “perceived usefulness.” One determinant of “perceived ease of use” is “computer self-efficacy,” the degree to which individuals believe they have the ability to interact with a specific technology.7 Different terms are used for this ability, such as digital literacy, digital competence, digital skills, and nursing informatics competency.8,9 In this study, we use the term digital competence because it reflects the “need for a wider and more profound content of the concept.”8,p.2

In general, competence for occupations goes beyond knowledge and skills and comprises cognitive competence (knowledge), functional competence (skills), and social competence (behavior, attitude).10 The combination of knowledge, skills, and attitude also applies to digital competence. A Delphi study commissioned by the European Commission's Joint Research Centre, in which 95 international experts debated the conceptualization of digital competence, concluded that digital competence consists of knowledge, skills, and attitudes (Figure 1).11

FIGURE 1: Framework of digital competence.

A review on health professionals' competence in digitalization further underlined the need to include attitude toward technology at work, in addition to knowledge and skills, in the understanding of digital competence.12 For digital competence among health professionals, attitude describes the feelings toward technology or the way of behaving when interacting with technology at work.12 Nonetheless, a definition of nurses' digital competence that comprises all three aspects is missing; where a definition was provided at all, the most used definitions rely on knowledge and skills.13 Thus, we used the following generic definition, which resulted from the above-described project of the European Commission: “Digital competence is a combination of knowledge, skills and attitudes with regards to the use of technology to perform tasks, solve problems, communicate, manage information, collaborate, as well as to create and share content effectively, appropriately, securely, critically, creatively, independently and ethically.”14

Insufficient digital competence of health professionals has been shown to be associated with higher levels of technology-induced stress at work,15 which in turn can lead to more burnout symptoms or lower job satisfaction among health professionals.16 Thus, nurses need adequate digital competence to use technologies appropriately and stay productive at work.1 This has been acknowledged internationally, and the American Association of Colleges of Nursing described “informatics and healthcare technologies” as part of the core competencies for nursing education.17 These core competencies describe the ability to identify suitable technologies and to use them accordingly. However, the required competencies differ across specific nursing roles, such as nurse managers, nurse informatics specialists, or nurses in clinical practice.4 Whereas nurse managers play a central role in strategic decisions regarding the implementation of technology and the allocation of financial resources, nurses in clinical practice use technology to monitor patients' health status, to secure patient-related information, and for interprofessional communication. Nurses in clinical practice also use digital technologies for care planning and clinical reasoning, among other patient-related tasks.18

To improve the digital competence of nurses in clinical practice, nurse managers and those responsible for nurse training and further education need information about the nursing staff's current digital competence level. A recent scoping review on the assessment of nursing digital competence summarizes 14 questionnaires published between 2009 and 2019. The majority of these questionnaires include more than 50 items, which hampers their usability because of the time required to complete them.4 This is especially important for nurses in clinical practice, as they face time constraints and often experience a heavy workload.19

Of the 14 questionnaires, 10 were found to focus only on the topics knowledge and skills.4 The omission of attitude in studies about the digital competence of nurses is problematic insofar as a positive attitude toward, and good experience with, technology are known to be crucial for the successful implementation and usage of technologies.12,20,21 For example, using the EHR means that nurses should know what happens to the data entered and what can be done with it (knowledge), that they can open and close the program, edit the content, and communicate within the program (skills), and that they are not reluctant to use the program for information exchange (attitude).

The remaining four questionnaires identified in the scoping review to measure nurses' digital competence included the topics knowledge, skills, and attitude.4 One of the four questionnaires was specifically developed for entry-level nursing students.22 The other three questionnaires are based on the Self-Assessment of Nursing Informatics Competencies Scale (SANICS). SANICS is based on a specific curriculum for wireless informatics for safe and evidence-based advanced practice nurse care and thus additionally focuses on questions about the usage of wireless devices at work.9 This focus may not be equally relevant for all nurses in clinical practice, because digital maturity in healthcare differs internationally and wireless devices are thus not regularly implemented.23,24 Furthermore, SANICS was developed for nursing students and therefore also includes research and presentation skills,9 which do not reflect the top 10 core competency areas of clinical nursing from the Technology Informatics Guiding Education Reform (TIGER).18 TIGER is an international initiative of the Healthcare Information and Management Systems Society that aims to enable nurses and nursing students to engage successfully with technology at work.18

Therefore, a new brief questionnaire to measure nurses' digital competence in clinical practice is needed to overcome these drawbacks.

The guidelines by DeVellis25 are often used in scale development. They include the following eight steps: (1) determine clearly what is to be measured, (2) generate an item pool, (3) determine the format of measurement, (4) have the initial item pool reviewed by professionals with knowledge in the field, (5) consider the inclusion of validation items, (6) administer the items to a development sample, (7) evaluate the items, and (8) optimize the scale length. The first four steps lead to an initial item pool that is rated as relevant by professionals with knowledge in the field, which allows one to comment on content validity.25 Therefore, the aim of this study was to identify relevant items for an item pool to measure clinical practice nurses' digital competence, comprising the three dimensions knowledge, skills, and attitude, and to evaluate the items' content validity.

MATERIALS AND METHODS

A normative Delphi study26 was conducted to establish the content validity of the initial item pool covering the three dimensions knowledge, skills, and attitude.27 The normative Delphi technique is a structured and iterative process consisting of a series of surveys (rounds) in which individuals with knowledge in the respective field rate proposed theory-based items for their thematic relevance in order to reach a consensus about the relevant items that describe the theoretical construct.28

Preparatory Steps Before the Delphi Study

To prepare the Delphi study, the first three steps by DeVellis25 were carried out. For a description of the construct “digital competence” and the identification of the initial item pool, we conducted a literature search to identify relevant literature that had not been included in two recently published reviews.4,12 The focus was on the identification of questionnaires for nursing digital competence4 and the definition of health professionals' digital competence.12 We based the keywords on both of the identified reviews: “skills, competency, literacy, knowledge, attitude, expertise, ability, know-how” AND “Healthcare Informatic Technology, computer, Information Computer Technology, informatics, medical technology” AND “nurs*, health professional, health care.” For the literature search, the databases Web of Science, MEDLINE, CINAHL, PubMed, and Google Scholar were used. Only articles in German and English were included. Articles about scale development or articles discussing definitions of digital competence for nurses or health professionals were included. Articles that only cited a definition of digital competence in healthcare or only used a questionnaire to measure nurses' digital competence were excluded after screening the respective references. In this study, we used the following description of nurses' digital competence for the development of the item pool: (1) a person must “have underlining knowledge, functional skills, and appropriate social behavior (eg, attitude) to be effective at work.”10 Thus, the digital competence of nurses in clinical practice comprises knowledge, skills, and attitude.12

(2) The research group developed 37 items based on the findings from the literature search and organized them into the three categories (knowledge, skills, and attitude) in MS Excel, with nine items for knowledge, nine items for skills, and 19 items for attitude. Whereas knowledge and skills have in common that they measure something factual, such as knowledge of available information systems (knowledge) or the ability to save a file (skills), attitude describes feelings, beliefs, and behavioral intentions toward technology.29 Consequently, more items for attitude were included because “attitudes are multifaceted” and thus are challenging to measure.30,p.79 In particular, because the attitude dimension was to be added to the digital competence construct and its scope can be interpreted broadly, a larger selection of items was made available to the panelists.

The items were formulated broadly and inclusively for any kind of technology usage in order not to develop a questionnaire that would be too time-consuming for nurses in daily practice. For example, instead of asking about specific skills and/or situations, such as “restarting the computer,” the research group developed an item that subsumes this and comparable cases of error management with technology: “I know how to manage errors of digital technology.” Furthermore, the items were positively phrased, because negatively phrased items31 as well as combinations of negatively and positively phrased items32 have been found to be less reliable. (3) The questionnaire to measure clinical practice nurses' digital competence will use a 5-point Likert scale as the measurement format because this is the most common item format for measuring opinions, beliefs, and attitudes.25

Review of the Item Pool

Study Sample

Panelists were recruited by contacting relevant international associations by email. These included the Canadian Nursing Informatics Association, the American Medical Informatics Association's Nursing Informatics Working Group, and the Schweizerische Interessengruppe Pflegeinformatik. The associations were asked to forward the invitation to their members and networks (snowball sampling). The email included information about the study's aim and the invitation to participate in all rounds. Furthermore, potential panelists from the research group's network were contacted directly and invited to participate with the same invitation email. When a participant did not respond in one round, he/she remained eligible to participate in the subsequent rounds, as this has been shown to lead to more representative conclusions.33

Although participants in Delphi studies are often referred to as experts, it is recommended to refrain from labeling the participants as experts because having knowledge in a specific field does not imply having expertise.34 Thus, the inclusion criteria for the panelists were as follows: a completed training program as a medical or nurse informatics specialist, a position as a digital manager in a health organization, or being a researcher with expertise in the field (eg, publications in the field). This ensured that different perspectives from research and practice were represented.34

Data Collection

We distributed the Delphi survey using the online survey tool UmfrageOnline (enuvo GmbH, Schwyz, Switzerland). The Delphi comprised as many rounds as needed to reduce the variance of the opinions so that they became more homogeneous.27 In each round, the panelists were asked to rate each item on a 4-point Likert scale (1, not relevant; 2, somewhat relevant; 3, quite relevant; 4, very relevant), as proposed by Polit and Beck.27 The panelists could also add free-text comments to suggest changes in the phrasing of the respective items or to suggest additional relevant topics as items for the upcoming rounds. After each round, all panelists received an online report on the results of the previous round regarding the items' relevance and the research team's decisions based on the free-text data. The panelists could thus retrace the decisions to exclude an item or to add new items.

Data Analysis: Content Validity Index

To quantify the panelists' consensus that all relevant questions would be asked to measure nurses' digital competence in clinical practice, a content validity analysis was conducted with the software R (RStudio, Boston, MA, USA).35 Content validity describes item sampling adequacy: high content validity exists if the item pool reflects the defined construct,25 in this case clinical practice nurses' digital competence. In each round, an item-level content validity index (I-CVI) was calculated to evaluate the relevance of each item.27 The I-CVI is computed as the number of panelists rating an item as quite relevant or very relevant divided by the total number of panelists.27 An item was considered relevant if its I-CVI was greater than 0.80; an item with an I-CVI below 0.80 was excluded. In addition, the average scale-level content validity index (S-CVI/Ave), the average of all I-CVIs, was computed for the last round. For acceptable content validity, an S-CVI/Ave of 0.90 or higher is expected.27
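
To illustrate the computation, the following minimal R sketch (R being the software used for the analysis) shows how the I-CVI per item and the S-CVI/Ave could be derived from a panelists-by-items rating matrix. The rating matrix, item names, and object names are hypothetical and serve only to make the example self-contained; this is not the study's original analysis script.

```r
# Minimal sketch, not the original analysis script: I-CVI and S-CVI/Ave from a
# hypothetical rating matrix (rows = panelists, columns = items) on the 4-point
# relevance scale (1 = not relevant ... 4 = very relevant).
set.seed(1)
ratings <- matrix(
  sample(1:4, size = 24 * 5, replace = TRUE),
  nrow = 24,
  dimnames = list(NULL, paste0("item_", 1:5))
)

# I-CVI: proportion of panelists rating an item as quite (3) or very (4) relevant
i_cvi <- colMeans(ratings >= 3)

# Items below the 0.80 threshold would be excluded from the next round
excluded_items <- names(i_cvi)[i_cvi < 0.80]

# S-CVI/Ave: the mean of all I-CVIs, reported for the final round
s_cvi_ave <- mean(i_cvi)

round(i_cvi, 2)
excluded_items
round(s_cvi_ave, 2)
```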

Data Analysis: Open-Ended Responses

The open-ended responses proposing new items or topics were clustered thematically and discussed in the research group until a consensus was reached on how to rephrase an item or if the suggestions resulted in an additional item.28

Ethical Considerations

This study is not covered by the Swiss Human Research Act; accordingly, no approval from the responsible authority was required. Before the start of the study, the panelists received written information about the contents and aim of the study and the voluntary nature of their participation. They gave their informed consent by confirmation via the survey link. The data were anonymized during data preparation so that they could not be traced back to the panelists. The panelists had the option to stop their participation without giving a reason.

RESULTS

Overall, we conducted three rounds of this Delphi study between May 2020 and January 2021 in order to reach the requirements for a content-valid item pool to measure nurses' digital competence in clinical practice. Not all panelists participated in all three rounds (Figure 2).

FIGURE 2: Responding panelists per round.

The characteristics of the participants are described in Table 1. The mean (SD) age of the panelists in all rounds was 45.5 (9.6) years, and the majority were male (n = 17, 59%). The majority of the panelists were from Switzerland (n = 15, 51.7%), followed by Germany (n = 6, 20.7%), Netherlands (n = 3, 10.3%), the United Kingdom (n = 3, 10.3%), Austria (n = 1, 3.4%), and Italy (n = 1, 3.4%). Regarding their profession, the majority of the panelists were nurse informatics specialists (n = 16, 55.2%), followed by digital managers (n = 8, 27.6%), researchers (n = 3, 10.3%), and medical informatics specialists (n = 2, 6.9%).

Table 1 - Sample Characteristics (Mean [SD] or n [%])

Age, y: 45.5 (9.6)
Sex
  Female: 17 (63)
Country
  Switzerland: 13 (48.2)
  Germany: 6 (22.2)
  Netherlands: 3 (11.1)
  United Kingdom: 3 (11.1)
  Austria: 1 (3.7)
  Italy: 1 (3.7)
Profession
  Nurse informatics specialists: 15 (55.6)
  Digital managers: 7 (25.9)
  Researchers: 3 (11.1)
  Medical informatics specialists: 2 (7.4)

The numbers of items in the Delphi study per round are summarized in Figure 3.

FIGURE 3: Number of items in the Delphi study per round.

Round 1

The first round took place in May to June 2020 with 24 panelists; three panelists did not respond (nonresponse, 12.5%). In the first round, the I-CVI of the 37 items (knowledge, 9; skills, 9; attitude, 19) ranged between 0.33 and 1.00, showing a high variance in the rated relevance across the initial items. Overall, nine of the 37 initial items (knowledge, n = 4; skills, n = 2; attitude, n = 3) had an I-CVI below 0.8 and were therefore excluded from the second round (Table 2). The main comment of the panelists on items with an I-CVI below 0.8 was that these items were not relevant for nurses in clinical practice. For instance, the item “I am familiar with the digital technology activities in the world” was rated as not relevant because it was formulated too broadly, and familiarity with digital technology at the workplace was seen as sufficient for nurses' clinical work. The item “I know how to manage errors of digital technology” was also rated as not relevant because the panelists expected nurses in clinical practice to contact IT support if errors with digital technologies occur at the workplace. The panelists provided eight additional items, which were rephrased to match the structure of the existing items and added in round 2. They included items such as “I am aware that patients themselves are increasingly using digital technologies to manage their symptoms” or “I am able to reach conclusions based upon information gathered on digital technologies.”

Table 2 - List of Excluded Items With an I-CVI Below 0.8 Across All Three Rounds

Knowledge (n = 9)
  I am confident in providing a definition of digital technology. (I-CVI 0.44)
  I am familiar with the digital technology activities in my country. (I-CVI 0.67)
  I am familiar with the digital technology activities in the world. (I-CVI 0.33)
  I am familiar with the current limitations of digital technologies. (I-CVI 0.71)
  I am aware that digital technologies can only assist me in the decision-making process. (I-CVI 0.39)
  It is clear to me why standardized comparable data are needed in the nursing profession. (I-CVI 0.72)
  I am aware that only a standardized nursing language offers the basis for evidence-based nursing development. (I-CVI 0.11)
  I am familiar with the digital technology activities employed by my organization. (I-CVI 0.71)
  I am familiar with the latest possibilities offered by digital technology at my workplace. (I-CVI 0.71)

Skills (n = 3)
  I know how to manage errors of digital technology. (I-CVI 0.59)
  I can support my team in the application of digital technologies. (I-CVI 0.75)
  I feel confident in advising my patients on the use of digital technologies to support their recovery. (I-CVI 0.65)

Attitude (n = 5)
  Innovation in digital technology should be a priority of the decision makers. (I-CVI 0.57)
  I would like to support the development of useful digital technology. (I-CVI 0.71)
  I would find new digital technologies easy for me. (I-CVI 0.57)
  Digital technologies promote the involvement of patients in documentation and treatment. (I-CVI 0.76)
  I use digital technology even if it is not mandatory. (I-CVI 0.67)

Round 2

Round 2 was conducted from August to September 2020 with 21 panelists. Overall, six panelists did not respond in round 2 (nonresponse, 28.6%). In the second round, the I-CVI of the 36 items (knowledge, n = 12; skills, n = 8; attitude, n = 16) ranged between 0.11 and 1.00. In total, five (knowledge, n = 3; skills, n = 1; attitude, n = 1) of the 36 items had an I-CVI below 0.8 and were therefore excluded from the third round. All five excluded items had been added for round 2 based on the panelists' comments from round 1. The panelists' comments on the items with an I-CVI below 0.8 were that they were too specific and more relevant for nurse informatics specialists. “The awareness that standardized comparable data are needed in the nursing profession” is an example of an item with an I-CVI below 0.8. The panelists' comments in round 2 resulted in the revision of five items in the skills topic to achieve uniform wording: “I feel confident about using digital technology to […].” Five further comments concerned the similarity of three items focusing on “ethical,” “privacy,” and “confidentiality.” The research group decided to reduce the three items to one item in favor of the topic “confidentiality,” as proposed by the panelists. It is argued that “confidentiality” refers to the duty of anyone entrusted with health information to keep that information private, which is understood as the ethical handling of electronic health information.

Round 3

Round 3 took place from December 2020 to January 2021 with 21 panelists. Overall, six panelists did not respond in round 3 (nonresponse, 28.6%). Among those who responded were two panelists who had not participated in the second round. The questionnaire for the third round comprised 29 items (knowledge, 6; skills, 8; attitude, 15). The I-CVI ranged between 0.67 and 1.00, showing a lower variance across the rated items than in the previous rounds. Overall, three of the 29 items had an I-CVI below 0.8 and were therefore excluded (Table 3). The S-CVI/Ave for the final list of 26 items (knowledge, 4; skills, 8; attitude, 14) was 0.95 (SD, 0.07).

Table 3 - Final List of Items With Their I-CVI

Knowledge (n = 4)
  In general, I would rate my knowledge of digital technology as satisfactory. (I-CVI 1.00)
  I am familiar with digital technologies at my workplace. (I-CVI 0.95)
  I am familiar with the current laws and regulations pertaining to the protection and exchange of medical data (eg, data protection, informed consent, and confidentiality) at my workplace. (I-CVI 0.86)
  Patients use digital technologies to manage their symptoms themselves. (I-CVI 0.81)

Skills (n = 8)
  I feel confident in dealing with confidentiality issues relating to digital technology at my workplace. (I-CVI 1.00)
  I feel confident about using digital technology to share information. (I-CVI 1.00)
  I feel confident about using digital technology to obtain data and information on clinical care. (I-CVI 1.00)
  I am able to reach conclusions based on information acquired through digital technologies. (I-CVI 1.00)
  I feel confident about using digital technology to communicate. (I-CVI 0.90)
  I feel confident about the secure management of health data using digital technology. (I-CVI 0.90)
  I feel confident about using digital technology to find relevant information. (I-CVI 0.86)
  I feel confident about using digital technology. (I-CVI 0.81)

Attitude (n = 14)
  Digital technologies will make my day-to-day work easier. (I-CVI 1.00)
  I have an open attitude toward digital technology–related innovations at my workplace. (I-CVI 1.00)
  Digital technology fits well with the way I like to work. (I-CVI 1.00)
  I enjoy using digital technology at my workplace. (I-CVI 1.00)
  I encourage others to use digital technology in their professional practices. (I-CVI 1.00)
  I am willing to improve my ability to use digital technology through further training. (I-CVI 1.00)
  I believe that digital technology provides numerous benefits in terms of quality of care. (I-CVI 1.00)
  I believe that digital technology improves clinical care. (I-CVI 1.00)
  I believe that digital technology improves patient outcomes. (I-CVI 1.00)
  I believe that digital technology is beneficial for my patients. (I-CVI 1.00)
  I believe that digital technology is beneficial for health professionals. (I-CVI 1.00)
  I like to use digital technology at work. (I-CVI 0.90)
  I am keen to use new digital technologies in my future professional practices. (I-CVI 0.86)
  I believe that digital technology is relevant for my future profession. (I-CVI 0.81)

DISCUSSION

The current study focused on the identification of a content-valid item pool as the basis for a future questionnaire measuring the digital competence of nurses in clinical practice, composed of the dimensions knowledge, skills, and attitude. The Delphi study resulted in 26 items (knowledge, 4; skills, 8; attitude, 14) derived from a 37-item pool, with an acceptable S-CVI/Ave score above 0.9. We included proportionally more items (n = 19) on the topic attitude than on the other topics in order to have the expert panel evaluate their relevance, and the panel rated 14 of them as relevant. This underlines the importance of attitude as one topic of digital competence.

The excluded items reflect the expert panel's consensus that nurses in clinical practice are not expected to solve problems with digital technologies themselves but to have them solved by IT support. The panelists rated topics such as being familiar with the current limitations of digital technology or the ability to manage errors of digital technology as not relevant for clinical practice nurses. The low relevance of managing errors or solving problems with digital technology for nurses in clinical practice is in line with the international recommendations of core competencies for clinical nursing.18,36

The expert panel also rated the relevance of the items about nurses' familiarity with digital technologies available worldwide or supporting the development of useful digital technology as low. Hence, according to the panelists, nurses working in clinical practice are not expected to have a comprehensive overview of potentially available digital solutions or to support the development of digitalization processes at work. This might be a problematic assessment across all nursing generations, because generation Z nurses (ie, digital natives), in particular, have a more comprehensive knowledge about the possibilities of technology and expect active usage of technology at work.37 Thus, they could contribute to finding innovative solutions at work. With this focus, however, technological innovation is limited to a top-down process, as nurses in clinical practice are not expected to engage with it. For the item pool, this could mean that future adaptations might be necessary to reflect the continually changing role of nurses in clinical practice, influenced by disruptive change and the technology-related abilities of younger generations at work.

The current 26 items, which were rated as relevant, represent an item pool and not a final version of the questionnaire to assess digital competence. The next steps in the guidelines describe conducting a factor analysis on this draft 26-item pool and testing it for internal consistency. The findings of these tests could result in item reduction and may lead to a brief, valid, and reliable questionnaire that can be used to assess digital competence.

Compared with other questionnaires measuring nurses' digital competence, the items in this questionnaire have a more general wording; they are formulated to match an overarching competency rather than specific skills. Whereas SANICS, for example, asks specifically about the ability to navigate the operating system Windows,9 the present questionnaire addresses the ability to use digital technology more broadly. On one hand, SANICS is thus limited to the evaluation of technologies using a specific operating system. On the other hand, specifically mentioning an operating system implies that nurses know the corresponding operating system for each device they use. Given the ongoing change in software and hardware in healthcare,1,2 a more general formulation of the items appears more durable. Further comparison of the present questionnaire with other available questionnaires measuring nurses' digital competence should be conducted by evaluating concurrent and criterion validity in future research.

Strengths and Limitations

One strength of the study is the multistep Delphi method, which achieves satisfactory content validity by involving different professions with knowledge in the field. Within the multistep procedure, the panelists could reconsider the topics' relevance over several rounds and reassess their initial ratings.26 The controlled feedback process between the research group and the panelists made both the researchers' and the panelists' decisions traceable.26

There are also limitations to be considered. Delphi studies are prone to bias, such as investigator bias.38 We applied several methods to reduce this bias. First, we used an acknowledged and transparent analysis method by calculating the CVI. Second, we discussed open-ended responses in the research group in order not to overweight the opinion of a single person. Furthermore, we cannot exclude potential sampling bias of the panelists in this study, because the panelists' underlying understanding of digital competence was not assessed beforehand and we used convenience sampling. Nonetheless, the panelists rated several items from the knowledge and skills dimensions as not relevant, which indicates that their agreement on the relevance of the items describing nurses' attitude toward technology at work was not indiscriminate. Yet, it may be that other panelists would have rated differently. Furthermore, the study only allows us to interpret the content validity. For a questionnaire to be used in future research and practice, further validation of construct validity, concurrent validity, criterion validity, and internal consistency is needed. In addition, only four items from the item pool are meant to measure the dimension “knowledge,” which raises the concern that these items might insufficiently cover the dimension. However, in the first round, we proposed nine items for the knowledge dimension, and the panelists rated five of them below the threshold and proposed additional items. A psychometric test of the item pool will show to what extent the four items cover the dimension and whether further items are needed.

CONCLUSIONS

This Delphi study led to a content-valid item pool as a basis for the development of a brief questionnaire for clinical practice nurses' digital competence. Further psychometric testing is needed before it can be applied. The study contributes to the discussion about the definition of nurses' digital competence by indicating that the nurses' attitude is seen as relevant in the context of digital competence.

Acknowledgment

We wish to thank the medical/nurse informatics specialists and researchers for participating in the Delphi study.

References

1. Booth RG, Strudwick G, McBride S, O'Connor S, Solano López AL. How the nursing profession should adapt for a digital future. BMJ. 2021;373:n1190. doi:10.1136/bmj.n1190.
2. Mather CA, Cummings E. Developing and sustaining digital professionalism: a model for assessing readiness of healthcare environments and capability of nurses. BMJ Health & Care Informatics. 2019;26(1):e100062. doi:10.1136/bmjhci-2019-100062.
3. Schenk E, Marks N, Hoffman K, Goss L. Four years later: examining nurse perceptions of electronic documentation over time. The Journal of Nursing Administration. 2021;51(1):43–48.
4. Kleib M, Chauvette A, Furlong K, Nagle L, Slater L, McCloskey R. Approaches for defining and assessing nursing informatics competencies: a scoping review. JBI Evidence Synthesis. 2021;19(4):794–841. doi:10.11124/JBIES-20-00100.
5. van Berkel C, Almond P, Hughes C, Smith M, Horsfield D, Duckworth H. Retrospective observational study of the impact on emergency admission of telehealth at scale delivered in community care in Liverpool, UK. BMJ Open. 2019;9(7):e028981.
6. De Groot K, De Veer AJE, Paans W, Francke AL. Use of electronic health records and standardized terminologies: a nationwide survey of nursing staff experiences. International Journal of Nursing Studies. 2020;104:103523. doi:10.1016/j.ijnurstu.2020.103523.
7. Venkatesh V, Bala H. Technology acceptance model 3 and a research agenda on interventions. Decision Sciences. 2008;39(2):273–315. doi:10.1111/j.1540-5915.2008.00192.x.
8. Ilomäki L, Kantosalo A, Lakkala M. What is digital competence? In: Linked Portal. Brussels: European Schoolnet (EUN); 1–12. https://helda.helsinki.fi/handle/10138/154423.
9. Yoon S, Yen PY, Bakken S. Psychometric properties of the Self-Assessment of Nursing Informatics Competencies Scale. Studies in Health Technology and Informatics. 2009;146:546–550.
10. Le Deist FD, Winterton J. What is competence? Human Resource Development International. 2005;8(1):27–46. doi:10.1080/1367886042000338227.
11. Janssen J, Stoyanov S, Ferrari A, Punie Y, Pannekeet K, Sloep P. Experts' views on digital competence: commonalities and differences. Computers & Education. 2013;68:473–481.
12. Konttila J, Siira H, Kyngäs H, et al. Healthcare professionals' competence in digitalisation: a systematic review. Journal of Clinical Nursing. 2019;28(5–6):745–761. doi:10.1111/jocn.14710.
13. Longhini J, Rossettini G, Palese A. Digital health competencies among health care professionals: systematic review. Journal of Medical Internet Research. 2022;24(8):e36414. doi:10.2196/36414.
14. Skov A. What is digital competence? Center for Digital Dannelse. https://digital-competence.eu/dc/en/front/what-is-digital-competence/
15. La Torre G, Esposito A, Sciarra I, Chiappetta M. Definition, symptoms and risk of techno-stress: a systematic review. International Archives of Occupational and Environmental Health. 2019;92(1):13–35. doi:10.1007/s00420-018-1352-1.
16. Golz C, Peter KA, Müller TJ, Mutschler J, Zwakhalen SMG, Hahn S. Technostress and digital competence among health professionals in Swiss psychiatric hospitals: cross-sectional study. JMIR Mental Health. 2021;8(11):e31408. doi:10.2196/31408.
17. American Association of Colleges of Nursing. The Essentials: Core Competencies for Professional Nursing Education. American Association of Colleges of Nursing; 2021. https://www.aacnnursing.org/Portals/42/AcademicNursing/pdf/Essentials-2021.pdf
18. Hübner U, Shaw T, Thye J, et al. Technology Informatics Guiding Education Reform—TIGER. Methods of Information in Medicine. 2018;57(suppl 01):e30–e42. doi:10.3414/ME17-01-0155.
19. Peter KA, Schols JMGA, Halfens RJG, Hahn S. Investigating work-related stress among health professionals at different hierarchical levels: a cross-sectional study. Nursing Open. 2020;7(4):969–979. doi:10.1002/nop2.469.
20. Safi S, Thiessen T, Schmailzl KJ. Acceptance and resistance of new digital technologies in medicine: qualitative study. JMIR Research Protocols. 2018;7(12):e11072. doi:10.2196/11072.
