Submitted on 15 Sep 2022; accepted on 28 Oct 2022
Introduction

Health information is one of the most sought-after topics on the internet in the USA and Europe [1]. YouTube (YT) is the world's most widely used video hosting site and the second-most popular website after Google [2]. Many users consider the internet a valuable and reliable source of health-related information, and many consult online sources before seeking professional help [3]. This behavior has increased steadily in recent years, especially during the COVID-19 pandemic [4]. YT videos have the potential to influence the decision-making processes of patients and their relatives and to affect how they manage their illness and their daily lives [5]. Therefore, as one of the most accessed websites and video-sharing platforms worldwide, YT is likely to spread misleading information to healthcare consumers, which can have devastating effects. It is extremely important to use appropriate tools to test and audit the information shared on this platform [6]. Unlike journal articles or guidelines, which are subject to intensive review and detailed evaluation, videos uploaded to YT undergo no review process.
The prevalence of noncommunicable diseases is on the increase worldwide. Among these diseases, neurological disorders, some of which have high global mortality and morbidity rates, occupy an important place in this increase [7]. However, there remains a dearth of studies that analyze the reliability of medical information on social media platforms, especially with regard to chronic neuropsychiatric conditions.
Essential tremor (ET) has been shown to be the most common movement disorder worldwide. Its symptoms occur during voluntary movement and involve a tremor of the hands and arms at a frequency of 8 to 12 Hz, sometimes accompanied by tremor of the head or voice. While the prevalence of ET is 0.9% across all age groups, it is higher among individuals over 65 years of age [8]. As there is no specific biomarker, laboratory test, or imaging method for the diagnosis of ET, diagnosis generally relies on medical history and neurological examination findings [9].
Studies have evaluated the quality of YT videos addressing various neurological disorders, such as arteriovenous malformations, narcolepsy, hydrocephalus, peripheral neuropathy, and infantile spasms, and have found the related video content to be of low quality [4]. No research has yet been conducted regarding the accuracy, adequacy, or usefulness of the information contained in YT videos about ET. The lack of data on the reliability of the information on this common movement disorder in videos uploaded to social media platforms such as YT reveals a clear need for study in this area. We therefore aimed to evaluate the quality, reliability, and usefulness of ET-related videos on YT using quantitative instruments. We also aim to use this information to point readers to the highest-quality videos on the platform so that patients, their relatives, and healthcare professionals may access this content more easily.
Methods

Data Collection

To perform the video analysis, we accessed YT using "incognito mode" in Google Chrome; to eliminate its potential effect on search results, we deleted the search history on the computer we used. We did not use a personal account to perform searches, and we ran the searches using the default "sort by relevance" setting. On March 21, 2022, we searched for videos by typing the keywords "essential tremor," "postural tremor," "action tremor," "essential tremor hand," and "essential tremor head" into the YT search bar. It has been reported that 90% of viewers do not move past the first 30 videos in search results; we therefore reviewed only the top 30 videos for each search term [10]. We imported the videos into a database before viewing and analyzing them in order to capture what a typical user would see and to obtain more objective data. Videos longer than 1 hour, those in a language other than English, those without audio, duplicates, and irrelevant videos (e.g., commercial, advertising, or music videos) were excluded from the study. A flowchart of the video exclusion criteria is provided in Figure 1. As in previous studies in the literature, ethics committee approval was not obtained, since no animal or human data were used in the study and all videos were publicly available.
Figure 1. Flowchart of the selection of videos.
Scoring System

The video content was evaluated by two independent practitioners, each with 5 years of experience, using the DISCERN criteria and the Global Quality Scale (GQS). DISCERN is a rating scale designed to evaluate the quality and reliability of a publication or broadcast. The DISCERN instrument comprises 16 questions, each scored from 1 to 5, with 1 point indicating that the quality criteria are not met and 5 points representing complete compliance with all quality criteria. Item 16 is a holistic, concise assessment of the entire video (see Table 1) [11]. To determine the final DISCERN score, the scores from the first 15 questions are summed, and the videos are classified as follows: excellent (63–75 points), good (51–62 points), fair (39–50 points), poor (27–38 points), or very poor (15–26 points) [12]. GQS scores range from 1 to 5, with 1 to 2 points indicating low video quality, 3 points moderate quality, and 4 to 5 points high quality [13].
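The 15-item band boundaries described above can be expressed as a simple lookup. The following is a minimal sketch (the function name and structure are our own, not part of the DISCERN instrument):

```python
def classify_discern(total: int) -> str:
    """Map a summed DISCERN score (items 1-15, possible range 15-75) to its quality band."""
    if not 15 <= total <= 75:
        raise ValueError("DISCERN total must be between 15 and 75")
    if total >= 63:
        return "excellent"   # 63-75 points
    if total >= 51:
        return "good"        # 51-62 points
    if total >= 39:
        return "fair"        # 39-50 points
    if total >= 27:
        return "poor"        # 27-38 points
    return "very poor"       # 15-26 points
```

For example, the final mean DISCERN score of 41.96 reported in the Results falls in the fair band.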
Table 1
DISCERN items and scoring.
Each item is rated on a scale from 1 to 5.

1. Are the aims clear?
2. Does it achieve its aims?
3. Is it relevant?
4. Is it clear what sources of information were used to compile the publication?
5. Is it clear when the information used or reported in the publication was produced?
6. Is it balanced and unbiased?
7. Does it provide details of additional sources of support and information?
8. Does it refer to areas of uncertainty?
9. Does it describe how each treatment works?
10. Does it describe the benefits of each treatment?
11. Does it describe the risks of each treatment?
12. Does it describe what would happen if no treatment is used?
13. Does it describe how the treatment choices affect overall quality of life?
14. Is it clear that there may be more than one possible treatment choice?
15. Does it provide support for shared decision-making?
16. Based on the answers to all of the above questions, rate the overall quality of the publication as a source of information about treatment choices.

Audience Engagement Variables

We recorded quantitative data on video characteristics, including the total number of views, time elapsed since upload, video duration, number of comments, video power index (VPI), view ratio, like ratio, and numbers of likes and dislikes. The like ratio was calculated as number of likes/(number of likes + number of dislikes) × 100, the view ratio as number of views/days since upload, and the VPI as (like ratio × view ratio)/100.
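These engagement metrics can be sketched in a few lines of Python (a minimal illustration; the function names are ours). Note that we assume the (like ratio × view ratio)/100 form of the VPI used in comparable audience-engagement analyses, which matches the magnitudes reported in the Results:

```python
def like_ratio(likes: int, dislikes: int) -> float:
    """Percentage of positive reactions among all recorded reactions."""
    total = likes + dislikes
    return 100.0 * likes / total if total else 0.0

def view_ratio(views: int, days_since_upload: int) -> float:
    """Average number of views per day since upload."""
    return views / days_since_upload

def video_power_index(likes: int, dislikes: int, views: int, days: int) -> float:
    """VPI = (like ratio x view ratio) / 100."""
    return like_ratio(likes, dislikes) * view_ratio(views, days) / 100.0
```

For instance, a video with 90 likes, 10 dislikes, and 300 views over 10 days would have a like ratio of 90, a view ratio of 30, and a VPI of 27.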
Qualitative VariablesDuring the analysis, the practitioners noted whether the videos contained clear information concerning ET, such as symptoms, etiology, epidemiology, diagnosis, treatment, treatment response, prognosis, diagrams, radiological images, and animations; they also noted whether the presenter was a doctor or a patient. We classified the upload source of each video into one of five categories: doctor, hospital, education channel, patient, or other.
Statistical Methods

We used descriptive statistics (mean, median, range, and standard deviation) for the continuous variables and applied the Mann-Whitney U test for between-group comparisons. We used the intra-class correlation coefficient to assess inter-rater agreement and Spearman's rank correlation coefficient to establish the correlations between variables. P values less than 0.05 were considered statistically significant. All analyses were performed using SPSS v. 21.
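Spearman's rank correlation, used here for the correlation analyses, is simply the Pearson correlation computed on ranks, with tied values sharing their average rank. A pure-Python sketch (our own illustration, not the SPSS implementation):

```python
from math import sqrt

def average_ranks(values):
    """Assign 1-based ranks, giving tied values the average of their positions."""
    order = sorted(range(len(values)), key=lambda i: values[i])
    ranks = [0.0] * len(values)
    i = 0
    while i < len(order):
        j = i
        # extend j over the run of values tied with position i
        while j + 1 < len(order) and values[order[j + 1]] == values[order[i]]:
            j += 1
        avg = (i + j) / 2 + 1  # average of the tied positions, 1-based
        for k in range(i, j + 1):
            ranks[order[k]] = avg
        i = j + 1
    return ranks

def spearman_rho(x, y):
    """Spearman's rank correlation: Pearson correlation of the rank vectors."""
    rx, ry = average_ranks(x), average_ranks(y)
    n = len(x)
    mx, my = sum(rx) / n, sum(ry) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    var_x = sum((a - mx) ** 2 for a in rx)
    var_y = sum((b - my) ** 2 for b in ry)
    return cov / sqrt(var_x * var_y)
```

A perfectly monotonic pair of score lists yields rho = 1.0, and a perfectly reversed pair yields rho = -1.0, which is how the strong DISCERN-GQS agreement (r = 0.907) reported below should be read.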
Results

Eighty-three videos met the inclusion criteria for the study. Most of the analyzed videos had been produced by education channels (42.2%, 35 videos). The distribution of the remaining upload sources was as follows: hospital, 21 videos (25.3%); doctor, 20 videos (24.1%); patient, five videos (6%); and other, two videos (2.4%). Data on the video sources are shown in Figure 2. In 60 videos (72.28%), the presenter was a doctor, while in 23 videos (27.72%), the presenter was a patient.
Figure 2. Sources of the uploaded videos.
Qualitative Video Content

While 75.9% (n = 63) of the videos addressed symptomatology, 74.7% (n = 62) provided diagnostic information. The proportions of videos providing information about treatment and treatment response were 62.7% (n = 52) and 59.03% (n = 49), respectively, while 47% (n = 39) provided clear information to viewers. The proportions of videos containing etiological and epidemiological data were 19.3% (n = 16) and 13.3% (n = 11), respectively. The qualitative characteristics of the content of the ET videos are shown in Figure 3.
Figure 3. Qualitative characteristics of video content.
Quantitative Video Content

The means and ranges of the quantitative metrics for all analyzed videos were as follows: number of views (mean, 48,634.21; range, 19–1,252,972); number of likes (mean, 327.10; range, 0–3,300); number of dislikes (mean, 12.46; range, 0–235); like ratio (mean, 91.34; range, 0–100); number of comments (mean, 54.97; range, 0–1,287); view ratio (mean, 28.08; range, 0.06–379.57); video duration (mean, 546.16 sec; range, 58–3,563 sec); and number of days since upload (mean, 1,296.65; range, 25–3,530).
Video Quality Metrics

The mean DISCERN scores determined by the two raters were 41.95 ± 13.90 (range, 20–72) and 41.98 ± 14.04 (range, 19–72); the final mean DISCERN score was 41.96 ± 13.93 (range, 19–72). The mean GQS scores determined by the raters were 2.98 (range, 1–5) and 2.96 (range, 1–5), and the mean scores for DISCERN item 16 were 2.90 (range, 1–5) and 2.91 (range, 1–5). The DISCERN scores of both raters are graphed in Figure 4. To assess absolute agreement, we calculated the intra-class correlation coefficient between the two assessments: 0.9648 (95% confidence interval, 0.9468–0.9806) for single measures and 0.9344 (95% confidence interval, 0.9106–0.9602) for mean measures, indicating "excellent" reliability [14]. The mean DISCERN and GQS scores by video source were as follows: education channel, 42.82 and 3.05; hospital, 50.19 and 3.71; doctor, 38.10 and 2.75; patient, 24.80 and 1.20; and other sources, 21.50 and 1.22.
Figure 4. Mean DISCERN scores of the raters.
When we categorized the videos by quality based on their total DISCERN scores, 38.6% (n = 32) were in the poor category, 20.5% (n = 17) in the fair category, 19.3% (n = 16) in the good category, 9.6% (n = 8) in the excellent category, and the remaining 12% (n = 10) in the very poor category. Furthermore, videos containing the following qualitative elements had significantly higher DISCERN scores than those without: clear information (p < 0.001), symptoms (p = 0.045), etiology (p < 0.001), diagnosis (p = 0.001), treatment (p < 0.001), treatment response (p < 0.001), epidemiology (p < 0.001), diagrams (p < 0.001), and radiological images (p < 0.001). Videos containing these elements also had significantly higher GQS scores than the remaining videos: clear information (p < 0.001), symptoms (p = 0.001), etiology (p < 0.001), diagnosis (p < 0.001), treatment (p < 0.001), treatment response (p = 0.001), epidemiology (p = 0.001), prognosis (p = 0.033), diagrams (p < 0.001), and radiological images (p < 0.001). The relationship between the average scale scores of videos with and without these qualitative elements is shown in Table 2. Finally, the mean DISCERN and GQS scores of videos presented by a physician were significantly higher than those of videos presented by a patient (p < 0.001 for both). In the correlation analysis, we found a significant positive correlation between the DISCERN and GQS scores (p < 0.001, r = 0.907).
Table 2
Statistically significant relationships and selected qualitative video content.
(Table 2 reports, for each qualitative attribute, the counts and the mean ± SD DISCERN and GQS scores, with P values, for videos with and without the attribute.)

Certain qualitative attributes were also associated with higher audience engagement. Videos containing information on symptoms had significantly more views (p = 0.034), while those containing information on etiology had a significantly higher like ratio (p = 0.023). We did not observe any other significant relationships between qualitative video features and audience engagement parameters, nor did we find a significant relationship between the presenter being a doctor or a patient and audience engagement parameters.
Mean DISCERN scores were negatively correlated with the total number of views (p = 0.048, r = –0.217), view rate (p = 0.048, r = –0.217), number of comments (p = 0.017, r = –0.260), like ratio (p = 0.03, r = –0.239), and VPI (p = 0.032, r = –0.235). There was also a significant negative correlation between the mean GQS score and the like ratio (p = 0.03, r = –0.239). The results of the correlation analysis are shown in Table 3.
Table 3
Correlation analysis.
              VPI      LIKE RATIO  TOTAL VIEWS  VIEW RATE  TOTAL COMMENTS  MEAN GQS
Mean DISCERN  –0.235*  –0.239*     –0.217*      –0.217*    –0.260*         0.907**
Mean GQS      –0.139   –0.239*     –0.121       –0.123     –0.194          1.000

Values are Spearman's rho.
* Statistically significant at p < 0.05.
** Statistically significant at p < 0.01.
According to the DISCERN and GQS criteria, eight (9.6%) of the 83 analyzed videos were of excellent quality. The mean DISCERN score of these eight videos was 67.5, and each had a GQS score of 5. Their mean VPI was 21.33. Table 4 presents further data on these eight top-quality videos.
Table 4
The eight top-quality essential tremor videos.
Discussion

In this study, we found that the overall quality and reliability of ET-related videos on YT were poor: almost half of the analyzed videos were in the poor or very poor categories. This can be a detriment to patients who wish to gain medical insight and increase their knowledge about and control over their disease. We therefore consider that fully reliable information is not available to patients who seek information about ET on YT, or to healthcare professionals who may want to use these videos for educational purposes. To our knowledge, this is the first study in the medical informatics literature to examine YT content on this disorder, which widely affects the global population.
Systematic reviews have revealed that many health-related videos on YT are biased or incorrect [15]. One Canadian study reported that one-third of the adult population uses the internet to search for health-related information, and that more than one-third of those who visited a hospital due to a health problem discussed the information they encountered online with their doctors [16]. The damage caused by misleading information can reach dangerous levels; given that the majority of patients believe online information to be reliable, incorrect online medical information can lead to serious unforeseen problems [4, 17].
A review of the previous literature on YT content related to neurological disorders reveals that the content quality of YT videos about arteriovenous malformations, narcolepsy, and hydrocephalus was low [4]. On the other hand, one study reported that YT videos on stroke were of good quality according to the DISCERN criteria [18]. In addition, studies analyzing online content regarding conditions such as peripheral vertigo maneuvers, epilepsy, and movement disorders have reported accuracy rates between 30% and 60% [19, 20, 21]. Another study examining Parkinson's disease on YT found only 19% of the videos to be useful or very useful [22]. The low quality of videos published on these neurological disorders, many of which affect a significant part of the global population and can carry high mortality and morbidity, raises serious concerns.
Most of the videos in our study did not cite an information source, state when the information presented was produced, or mention treatment risks, alternative treatment options, or how treatment might affect quality of life. However, videos that provided clear information; covered symptomatology, treatment, treatment response, and prognosis; contained etiological and epidemiological data; or were enriched with diagrams, radiological images, and animations were of higher content quality. Previous YT analyses of neurological diseases such as stroke and hydrocephalus have obtained similar results [18, 23].
In this study, we observed that the videos with the highest VPIs fell into the "poor" and "very poor" categories when classified by mean DISCERN score. In addition, our correlation analysis uncovered significant negative correlations between the mean DISCERN and GQS scores and audience engagement parameters (VPI, like ratio, etc.). These findings show that the most popular YT videos on ET are of low educational quality and may be misleading; using them for educational purposes could therefore leave gaps in viewers' knowledge. Those who upload the videos with the highest VPIs may be motivated by commercial concerns to create content that increases views rather than providing high-quality, essential information. Studies have shown that videos containing misleading content tend to achieve more audience engagement than videos containing reliable content [24, 25, 26]. There are currently no restrictions on who can upload and share videos on YT, which may be one of the main reasons that many popular YT videos are of poor quality.
The first source of reference for education should be the medical literature obtained from medical publications and guidelines. However, it is not always possible for a person who is not a healthcare professional to access scientific data easily, and many people lack the search and evaluation skills necessary to find reliable information online. It is important, therefore, to raise the quality standards of medical videos uploaded to YT. Videos with quality educational content are undoubtedly more beneficial to patients seeking information and to the training of healthcare professionals, and they should theoretically receive more appreciation and interaction from viewers. As short, relevant videos are ranked higher by YT [27], content creators who aim to present accurate and useful data (such as hospitals, education channels, and doctors) should focus on producing short, concise, targeted videos that highlight important points, as this will enable them to reach a wider audience. Although the majority of the video uploaders in our study were educational channels and hospitals, and the majority of the narrators were doctors, the low quality of the content may reflect the competitive attitudes of doctors and medical channels that have become popular on social media: popularity may have overtaken scientific rigor.
In this study, the two individuals who evaluated video quality were general practitioners (i.e., not specialists in neurology or movement disorders). Although this may have reduced bias, it may also have made some specialized subjects (e.g., deep brain stimulation, radiological images) harder to evaluate. Another limitation of this research is that videos in languages other than English were not included. Finally, we performed the searches at a single point in time; given that YT video content is constantly changing, this is an inevitable limitation that may affect all studies in this area.
In this study, we showed that the content quality of ET videos on YT is generally not high. While we list the best-quality videos on ET as a reference for healthcare practitioners and patients, we do not suggest that medical videos be used as a guide for diagnosis or treatment. It would be more appropriate for medical videos on YT to highlight informative features and direct viewers to seek professional help, and patients should always verify information provided in online videos against more reliable sources. Since the internet and YT are dynamic, it may be useful to repeat this study in a few years to assess whether the content quality of videos has changed. Finally, those who provide video content should ensure that it is optimized to attract viewers' attention while presenting higher-quality information.
Competing Interests

The authors have no competing interests to declare.
References

Amazon. youtube.com competitive analysis, marketing mix and traffic. August 2019. https://www.alexa.com/siteinfo/youtube.com
Pew Internet American Life Project. Internet visits soaring. Health Manag Technol. 2003; 24: 2–8.
McMullan M. Patients using the Internet to obtain health information: How this affects the patient-health Professional relationship. Patient Educ Couns. 2006; 63: 24–28. DOI: https://doi.org/10.1016/j.pec.2005.10.006
Krakowiak M, Szmuda T, Fercho J, Ali S, Maliszewska Z, Słoniewski P. YouTube as a source of information for arteriovenous malformations: A content-quality and optimization analysis. Clin Neurol Neurosurg. 2021; 207: 106723. DOI: https://doi.org/10.1016/j.clineuro.2021.106723
Szmuda T, Talha SM, Singh A, Ali S, Słoniewski P. YouTube as a source of patient information for meningitis: A content-quality and audience engagement analysis. Clin Neurol Neurosurg. 2021; 202: 106483. DOI: https://doi.org/10.1016/j.clineuro.2021.106483
Madathil KC, Rivera-Rodriguez AJ, Greenstein JS, Gramopadhye AK. Healthcare information on YouTube: a systematic review. Health Informatics J. 2015; 21: 173–194. DOI: https://doi.org/10.1177/1460458213512220
GBD 2015 DALYs and HALE Collaborators. Global, regional, and national disability-adjusted life-years (DALYs) for 315 diseases and injuries and healthy life expectancy (HALE), 1990–2015: a systematic analysis for the Global Burden of Disease Study 2015. Lancet. 2016; 388: 1603–58. DOI: https://doi.org/10.1016/S0140-6736(16)31460-X
Louis ED, Ferreira JJ. How common is the most common adult movement disorder? Update on the worldwide prevalence of essential tremor. Mov Disord. 2010; 25: 534–541. DOI: https://doi.org/10.1002/mds.22838
Habipoglu Y, Alpua M, Bilkay C, Turkel Y, Dag E. Autonomic dysfunction in patients with essential tremor. Neurol Sci. 2017; 38(2): 265–269. DOI: https://doi.org/10.1007/s10072-016-2754-z
iProspect. Search Engine User Behaviour Study. Available at: www.iprospect.com. 2006; 17.
DISCERN-The DISCERN Instrument. Available at: http://www.discern.org.uk/discern_instrument.php. Accessed 2020.
Cassidy JT, Baker JF. Orthopaedic patient information on the World Wide Web: an essential review. J Bone Joint Surg Am. 2016; 98: 325–338. DOI: https://doi.org/10.2106/JBJS.N.01189
Singh AG, Singh S, Singh PP. YouTube for information on rheumatoid arthritis—a wakeup call? J Rheumatol. 2012; 39(5): 899–903. DOI: https://doi.org/10.3899/jrheum.111114
Koo TK, Li MY. A guideline of selecting and reporting intraclass correlation coefficients for reliability research. J. Chiropr. Med. 2016; 15: 155–163. DOI: https://doi.org/10.1016/j.jcm.2016.02.012
Madathil KC, Rivera-Rodriguez AJ, Greenstein JS, Gramopadhye AK. Healthcare information on YouTube: a systematic review. Health Informatics J. 2015; 21: 173–194. DOI: https://doi.org/10.1177/1460458213512220
Underhill C, Mckeown L. Getting a second opinion: health information and the Internet. Health Rep. 2008; 19: 65–69.
Swire-Thompson B, Lazer D. Public health and online misinformation: challenges and recommendations. Annu. Rev. Public Health. 2020; 41: 433–451. DOI: https://doi.org/10.1146/annurev-publhealth-040119-094127
Szmuda T, Alkhater A, Albrahim M, Alquraya E, Ali S, Al Dunquwah R, Słoniewski P. YouTube as a source of patient information for stroke: a content-quality and an audience engagement analysis. J. Stroke Cerebrovasc. Dis. 2020; 29: 105065. DOI: https://doi.org/10.1016/j.jstrokecerebrovasdis.2020.105065
Brna PM, Dooley JM, Esser MJ, Perry MS, Gordon KE. Are YouTube seizure videos misleading? Neurologists do not always agree. Epilepsy Behav. 2013; 29: 305–307. DOI: https://doi.org/10.1016/j.yebeh.2013.08.003
Kerber KA, Burke JF, Skolarus LE, Callaghan BC, Fife TD, Baloh RW, et al. A prescription for the Epley maneuver: www.youtube.com? Neurology. 2012; 79: 376–380. DOI: https://doi.org/10.1212/WNL.0b013e3182604533
Stamelou M, Edwards MJ, Espay AJ. Movement disorders on YouTube–caveat spectator. N Engl J Med. 2011; 365: 1160–1161. DOI: https://doi.org/10.1056/NEJMc1107673
Al-Busaidi IS, Anderson TJ, Alamri Y. Qualitative analysis of Parkinson’s disease information on social media: the case of YouTube™. EPMA J. 2017; 23: 273–277. DOI: https://doi.org/10.1007/s13167-017-0113-7
Szmuda T, Rosvall P, Hetzger TV, Ali S, Słoniewski P. YouTube as a source of patient information for hydrocephalus: a content-quality and optimization analysis. World Neurosurg. 2020; 138: 469–477. DOI: https://doi.org/10.1016/j.wneu.2020.02.149
Kim R, Park HY, Kim HJ, Kim A, Jang MH, Jeon B. Dry facts are not always inviting: a content analysis of Korean videos regarding Parkinson’s disease on YouTube. J Clin Neurosci. 2017; 46: 167–70. DOI: https://doi.org/10.1016/j.jocn.2017.09.001
Garg N, Venkatraman A, Pandey A, Kumar N. YouTube as a source of information on dialysis: a content analysis. Nephrology (Carlton). 2015; 20: 315–320. DOI: https://doi.org/10.1111/nep.12397
Desai T, Shariff A, Dhingra V, Minhas D, Eure M, Kats M. Is content really king? An objective analysis of the public’s response to medical videos on YouTube. PLoS One. 2013; 8: e82469. DOI: https://doi.org/10.1371/journal.pone.0082469
Gupta HV, Lee RW, Raina SK, Behrle BL, Hinduja A, Mittal MK. Analysis of youtube as a source of information for peripheral neuropathy. Muscle Nerve. 2016; 53(1): 27–31. DOI: https://doi.org/10.1002/mus.24916