Ill-informed patients make poor health decisions
This is the age of the empowered patient. The expert patient. The informed patient. About time, too. But where do patients get their information? Often not from the most reliable sources. And the consequences of badly informed health choices can be deadly.
Vaccination is reckoned to have saved more lives worldwide than any other medical intervention. But the massive public health benefits of vaccination have been eroded by waves of fake news. A recent article in Nature classed this as 'viral misinformation'.1 Trust in vaccines fell as a result of emotional contagion, triggered by the 1998 publication of Andrew Wakefield's paper in the Lancet linking the MMR vaccine with autism and bowel disease, and accelerated through digital channels.
MMR uptake dropped sharply in the early 2000s and has never fully recovered. The number of unvaccinated children in the US has quadrupled since 2001, according to the Centers for Disease Control and Prevention. Epidemics of measles are occurring across Europe. Rates of vaccination for influenza, HPV, tetanus and polio have also declined.
Misinformation frequently starts with bad science, which may be aggravated if it is financially motivated. Wakefield continues to stoke fears from his new base in Texas, including a campaign to encourage vaccine refusal that led to a measles outbreak among Somali immigrants. Mischief may also stem from political opportunism. Opposition to vaccines is a rallying call of right-wing populist groups including the National Rally party in France and the Five Star Movement in Italy, while Donald Trump has repeatedly tweeted that MMR vaccine causes autism.
Another big influence is 'super-spreaders', who propagate misinformation (motivated by ignorance or malice) through social media. In August 2018, Russian-backed bots and trolls were found to be spreading disinformation on vaccines as a way to polarise society. There is now a worldwide 'anti-vaxxer' movement that is increasingly vocal, as noted by our colleague David Anderson, who highlighted a successful ASA complaint against an anti-vaccination billboard in New Zealand. And on 1st November, the UK's Chief Medical Officer, Sally Davies, spoke on the BBC about the influence of social media in spreading vaccine myths.2
How can views that are so at odds with evidence gain traction? In the current era of fake news, sadly, there is mistrust of experts, and this extends to the medical establishment. When patients ask their doctor for health advice, they are hoping for simple answers they can act upon. But the truth is often complex and imponderable. Medical news is constantly evolving and often conflicting. Responsible scientists tend not to talk in absolutes - they couch their advice in probabilistic terms and caveat it with "More research is needed". For a layperson looking for certainty, lack of consensus suggests that nobody really knows, so any opinion is valid.
Most people don't have a firm grasp of statistics, and it can be difficult to explain risk to patients in terms they can easily understand. How do you convey the risk of false positives when screening for breast cancer, or the fact that measuring PSA has no impact on surviving prostate cancer? Millions of people take statins to reduce their risk of dying prematurely, but how would they feel if they were told that 67 people need to be treated to prevent one death (i.e. 66 are treated for no benefit)? If the QRISK calculator indicates they have a 13% risk of cardiovascular disease over the next 10 years, what should they do with that information?
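The 'number needed to treat' figure above comes from simple arithmetic on absolute risk. As a minimal sketch (using illustrative risk figures, not data from any specific statin trial), the calculation looks like this:

```python
def absolute_risk_reduction(risk_untreated, risk_treated):
    """Absolute risk reduction (ARR): the difference in event rates
    between untreated and treated patients."""
    return risk_untreated - risk_treated

def number_needed_to_treat(arr):
    """NNT: how many patients must be treated for one to benefit."""
    return 1 / arr

# Hypothetical example: treatment cuts 10-year mortality
# from 4.0% to 2.5%, an ARR of 1.5 percentage points.
arr = absolute_risk_reduction(0.040, 0.025)
nnt = number_needed_to_treat(arr)
print(f"ARR = {arr:.1%}, NNT = {round(nnt)}")  # ARR = 1.5%, NNT = 67
```

Note how small the absolute numbers are: a drop from 4.0% to 2.5% could equally be reported as a 37.5% relative risk reduction, which sounds far more impressive. This is why explaining relative versus absolute risk matters so much.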
Most people don't read medical journals, and when clinical results filter down to the lay press the information is often distorted or simplified, because stories sell better than facts. But the majority of the public don't even read the mainstream media. They get their information from other sources of authority.
The two-step model of communication flow suggests that people place trust in their peers or opinion leaders, who could be friends, family, Facebook groups, websites, or celebrities. A recent survey found that 61% of patients publicly share health information, primarily through Facebook, and the figure rises to 94% among 'high influencers'.3 There is certainly no shortage of high-profile celebs willing to dispense medical advice. Gwyneth Paltrow has taken a lot of flak for spreading low-fact health information, though there are signs that her Goop franchise is veering away from making overt health claims.
While most celebrity health advice falls into the category of 'flawed but harmless', the newly published autobiography by Tina Turner shows the good and bad ways that high-profile individuals can spread health information. Ms Turner describes how her life was saved by a kidney transplant from her husband, and she calls for more people to join the organ donor register. So far so good. Then she reveals the reason she needed a transplant was that she had swapped her antihypertensive treatment for homeopathy, which destroyed her kidneys. The take-home message from this could be 'homeopathy is useless', but it could equally be 'celebrities trust homeopathy as a viable alternative to conventional medicine'.
You can't entirely stop people from believing alternative facts and conspiracy theories, and you won't stop unscrupulous interests and 'truthless' politicians from exploiting the gullible. Doubt is easy to sow when the ground for mistrust is fertile. The tobacco companies fought the anti-smoking lobby by implying controversy when there was none. Fear and anger are the twin engines of social media, harnessed by tech companies, trolls and anti-vaxxers alike. The persisting belief in Creationism in the US demonstrates that faith is more powerful than facts. But if you define faith as "conviction in the absence of proof", faith extends to many health beliefs as well.
As a medical communication agency, we know that simply stating evidence doesn't change beliefs, either in healthcare professionals or in patients. People do not see their lives reflected in statistics, a point argued in this article in the Financial Times.4
It is easy to blame 'Dr Google' for medical misinformation, because health is one of the most popular search topics, and a lot of online information is inaccurate. But Google has now updated its algorithm (christened the Google Medic update) to force up the quality of YMYL (Your Money or Your Life) websites, which include health and wellness websites. Sites are now assessed for Expertise, Authoritativeness and Trustworthiness (EAT), and those that score poorly face a massive drop in ranking. We have used these criteria to test a number of pharma industry-generated websites, which rate surprisingly badly.
The author of the article in Nature is a director of the Vaccine Confidence Project, which addresses false rumours and scares before they snowball. Targeted social media has also been used to combat misinformation. In Denmark, reports of girls claiming to have been harmed by HPV vaccination led to a drop in vaccination rates from over 90% to under 20%, but a counter-information campaign led through social media succeeded in rebuilding confidence.
What else can we do to influence people's beliefs and protect them from irrational self-harm? It isn't just a matter of intelligence or education. Lots of smart and well-qualified people hold views that contradict medical orthodoxy. It's an issue of credibility and trust in sources of information that are evidence-based and independent. Popular TV programmes like Trust Me, I'm a Doctor set an example by taking a sceptical stance and presenting impartial advice that's easy to understand.
Openness and transparency build credibility. Education on vaccines should not come directly from vaccine manufacturers, but from external sources. People respond to debate, so start a conversation and invite all views, enabling false information to be refuted. Recruit 'benign social influencers' with the reach and credibility to spread truthful information that will improve people's health. Recognise that people have conflicting motivations: a loving mother may expose her child to the remote risk of disease to avoid the certain pain of an injection. Explain relative and absolute risk using simple narratives that people can relate to.
An informed audience is more likely to make smart choices that lead to better health outcomes. And that's got to be better for everyone.
1. The biggest pandemic risk? Viral misinformation. Nature, 16 October 2018. http://www.nature.com/articles/d41586-018-07034-4
2. https://www.bbc.co.uk/news/health-45990874
3. WEGO HEALTH 2017 Behavioral intent survey. https://www.wegohealth.com
4. How to save statistics from the threat of populism. Financial Times, 21 October 2018. https://www.ft.com/content/ca491f18-d383-11e8-9a3c-5d5eac8f1ab4
©2019 Life Healthcare Communications
www.life-healthcare.com
Address:
Life Healthcare Communications
The Hernes Oak
North Street
Winkfield
Windsor
SL4 4SY