The social media cancer misinformation conundrum

Key Points

Researchers found that 30.5% of the social media articles they surveyed contained harmful information. They noted that erroneous information spreads more widely than data-driven information from reliable sources.

When patients arrive at their clinical appointments wielding content from social media about their cancer diagnosis and treatment options, there is nearly a 1 in 3 chance that they will be armed with harmful misinformation, according to a new study appearing in JNCI: Journal of the National Cancer Institute (published online July 22, 2021. doi:10.1093/jnci/djab141).

Although some misinformation is harmless or easily corrected, this is not always the case; the researchers found that in many instances, patients come with a fervent hope that a flawed and potentially harmful, even deadly, unproven therapy will cure their disease. The researchers noted that compounding the problem, erroneous information spreads more widely than data-driven information from reliable sources.

The study authors stressed that this growing trend threatens public health and must be corrected quickly because it hinders the delivery of evidence-based medicine while possibly jeopardizing crucial patient-physician relationships.

Study Details

The researchers used web-scraping software to search for relevant keywords in popular English-language articles about the 4 most common cancers: breast, prostate, colorectal, and lung. Their investigation encompassed news articles and blog postings appearing on multiple platforms, including Facebook, Reddit, Twitter, and Pinterest, between January 2018 and December 2019. For each article's uniform resource locator (URL), they tallied positive engagements, such as "up-votes" on Twitter and Pinterest, "comments" on Reddit and Facebook, and "votes" and "shares" on Facebook.

They then selected the 50 articles about each cancer type with the greatest total engagement (all platforms combined), a total of 200 distinct articles, for further study. Because Facebook accounted for the most engagements, its content was also analyzed independently.
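For illustration only, the selection step might look something like the following sketch in Python with pandas; the column names and engagement figures here are hypothetical and are not the authors' actual pipeline.

```python
import pandas as pd

# Hypothetical scraped data: one row per article, with per-platform
# engagement counts collected from each article's URL.
articles = pd.DataFrame({
    "url": ["https://example.org/a", "https://example.org/b", "https://example.org/c"],
    "cancer_type": ["breast", "breast", "lung"],
    "facebook_engagement": [1500, 300, 2200],
    "other_engagement": [400, 100, 100],
})

# Total engagement across all platforms combined.
articles["total_engagement"] = (
    articles["facebook_engagement"] + articles["other_engagement"]
)

# Keep the 50 most-engaged articles for each cancer type.
top_articles = (
    articles.sort_values("total_engagement", ascending=False)
            .groupby("cancer_type")
            .head(50)
)
print(top_articles)
```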

To rate the content, the researchers identified and selected 2 National Comprehensive Cancer Network guideline panel members for each of the 4 cancers under investigation to serve as content experts. The content experts rated each article's primary medical claims with a 5-point Likert scale ([1] true, [2] mostly true, [3] a mixture of both true and false, [4] mostly false, or [5] false). Scores from both experts who assessed each article were combined, with a sum of 6 or greater classified as “misinformation.” National Comprehensive Cancer Network panelists also classified the primary medical claims as (1) certainly not harmful, (2) probably not harmful, (3) uncertain, (4) probably harmful, or (5) certainly harmful. Articles rated as “probably harmful” or “certainly harmful” by either or both reviewers were classified as “harmful.” The raters also used a checklist to record their reasons for classifying articles as “misinformation” or “harmful.”
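As a rough sketch, the classification rules just described reduce to a few lines of logic; the function and example ratings below are hypothetical and are not the authors' actual code.

```python
def classify_article(truth_scores, harm_scores):
    """Apply the article-level rules described in the study.

    truth_scores: two Likert ratings, 1 (true) to 5 (false), one per expert.
    harm_scores:  two ratings, 1 (certainly not harmful) to 5 (certainly harmful).
    """
    # Misinformation: the two truthfulness ratings sum to 6 or greater.
    misinformation = sum(truth_scores) >= 6
    # Harmful: either reviewer rated the claim probably (4) or certainly (5) harmful.
    harmful = any(score >= 4 for score in harm_scores)
    return misinformation, harmful

# Example: ratings of "mostly false" (4) and "a mixture" (3) sum to 7 -> misinformation;
# one reviewer rated the claim "probably harmful" (4) -> harmful.
print(classify_article(truth_scores=(4, 3), harm_scores=(4, 2)))  # (True, True)
```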

The total and Facebook-specific engagements for the presence or absence of misinformation and/or harm were assessed with a 2-sample Wilcoxon rank-sum test.
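For readers unfamiliar with the test, a 2-sample Wilcoxon rank-sum test (equivalent to the Mann-Whitney U test) can be run in Python with SciPy as in the sketch below; the engagement counts are invented for illustration and do not come from the study data.

```python
from scipy.stats import mannwhitneyu

# Hypothetical total-engagement counts for the two groups of articles.
engagements_misinformation = [2300, 4700, 1200, 3100, 2600]
engagements_factual = [1600, 819, 4700, 1100, 900]

# Two-sided Wilcoxon rank-sum / Mann-Whitney U test comparing the two samples.
stat, p_value = mannwhitneyu(
    engagements_misinformation, engagements_factual, alternative="two-sided"
)
print(f"U = {stat}, P = {p_value:.3f}")
```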

Study Results

Of the 200 articles investigated in depth, 37.5% (n = 75) were from traditional news sources, including online versions of print/broadcast media; 41.5% (n = 83) were from digital-only nontraditional news sources; 1% (n = 2) were from personal blogs; 3% (n = 6) were from crowdfunding sites; and 17% (n = 34) were from medical journals.

The median number of article engagements was 1900, with 96.7% of the engagements occurring through Facebook. Articles containing misinformation generated more engagements than factual articles (median, 2300 [interquartile range, 1200-4700] vs 1600 [interquartile range, 819-4700]; P = .05).

The content experts reported that 32.5% of the articles reviewed included misinformation (Cohen's κ coefficient = 0.63; 95% CI, 0.50-0.77). Their reasons included an article title that differed from the text, statistics and data that differed from the conclusion (28.8%), overstatement of the strength of the evidence, with weak evidence described as strong or vice versa (27.7%), and claims involving unproven therapies (26.7%).

The researchers found that 30.5% of the articles featured harmful information (Cohen's κ coefficient = 0.66; 95% CI, 0.52-0.80), chiefly because of the potential for harmful inaction that could lead to delay or refusal of medical attention for treatable or curable conditions (31%) and economic harm, such as out-of-pocket costs for treatment and/or travel (27.7%). They also identified harmful action from potentially toxic effects of a recommended test or treatment (17%) and harmful interactions due to unstated medical interactions with curative therapies (16.2%).

The researchers found that more than 75% of the articles containing misinformation (76.9%; 50 of 65) also included harmful information. They reported that articles containing misinformation generated more total engagements than articles with evidence-based data (median, 2300 vs 1600; P = .05).

Study Interpretation

The researchers acknowledged that a limitation of the study was its inclusion of only the most popular English-language cancer articles. They also said that the data lacked important qualitative information. "But this was determined to be beyond the scope of this report," says lead study author Skyler B. Johnson, MD, of the Department of Radiation Oncology at the University of Utah School of Medicine in Salt Lake City, Utah. "We believe this is the first study to quantify the amount of cancer misinformation on social media."

The researchers also stated that reviewer bias toward conventional cancer treatments is possible. “However, questions were structured to avoid stigmatization of nontraditional cancer treatments,” they wrote.

Wen-Ying Sylvia Chou, PhD, MPH, program director of the Health Communication and Informatics Research Branch at the National Cancer Institute in Bethesda, Maryland, says that the study is valuable because it sheds light on, and raises alarms about, an urgent problem. "The study tests whether cancer-related information is credible or trustworthy. Their data do point to some alarming trends."

Dr. Chou, who has been studying health information on social media for the past 5 years, recently edited a special issue of the American Journal of Public Health on health information, in which she coauthored an editorial on research priorities in this area (2020;110:S273-S275. doi:10.2105/AJPH.2020.305905). She says that the clinician's approach to misinformation is critical. "The first question clinicians should ask their patients, so they can understand where they are coming from, is where are they getting this information, and what do they think of it? I think it is very important for clinicians to not pass judgment until they get a little more data."

“We are in a very turbulent time when many people are fearful and mistrusting of traditional sources of health information, which is reflective of our polarized society,” she continues. She says that these factors—plus some resulting apathy—can contribute to misinformation “silos” online and possibly sow doubt and cause patients to be less trusting of their clinicians and mainstream health organizations. “This is very dangerous because the spread of misinformation has the potential to make society even more divisive.”

Dr. Chou says that she hopes either the current study authors or others will pursue the sources of false information and ascertain their motivations for sharing inaccurate information. “For example, it's possible that they're promoting useless or harmful products.”

Dr. Johnson says that he became interested in the topic as a medical student after searching the internet when his wife was diagnosed with cancer. He says he recognized that much of the information he found was bogus, and he immediately developed empathy for patients in dire need of good news and, hopefully, miracle cures. "We believe the takeaway message from this study is that misinformation online is common and something that patients with cancer will likely encounter. We need to work together to identify ways to help patients navigate the information that they will find online. Patients need to know how to identify harmful misinformation and where to go to get accurate information."

Photo credit: Shutterstock/PRPicturesProduction
