Facts. Alternative facts. Evidence. Statistics. Lies.
Author: Daniel Hardiman-McCartney MCOptom, Clinical Adviser and Martin Cordiner, Head of Research
Date: 27 February 2017
Although much in the news of late, arguments over information - and what it means - are far from new. For every occasion on which we can lay our hands on some incontrovertible truth, there are plenty of others in which the evidence combines to give us a compelling likelihood, rather than something as simple as cause and effect.
Which brings us to optometry. How do you know what works? What happens when there is disagreement? What happens when there is disagreement over time from the same source? When is the right point to stop doing one thing and to start doing another? To, for example, stop correcting myopia and to start treating it?
Optometry Tomorrow 2017 will provide an update of the latest findings. At the same time, speakers will disagree and opposing evidence will be presented. But evidence is not an immoveable object, and the important thing is to ask some key questions of any information.
Firstly, where has it come from? Is any study being referenced? Can that study actually be found? Does it say what the person suggests it says? Many newspaper articles mention ‘a study’ without saying anything more about it, and fewer still link to the work itself, which prevents us from investigating it at even a surface level.
Secondly, what other information is available? Is this the only study out there? Literature reviews are crucial to ensure that we do not miss the wood for a conveniently corroborating tree. Systematic reviews, in which specific search terms to answer a specific question are outlined first, are even better in ensuring (as far as possible) that all of the relevant information is included. It is the failure to aggregate all the available information, to give the full context, which can lead to the unfortunate or wilful misrepresentation of findings or statistics.
Then, how much should we believe the different pieces of information that we have found? This might sound like the trickiest part, but there are tools to help dissect the methodology of a project, to analyse for possible problems of bias or confounding factors, and to see which conclusions stand up and which don’t.
This is what the peer-review process of a scientific journal aims to do (if not perfectly), dissecting someone’s argument to see if it can be broken with a bit of prodding, and it’s where the notion of a ‘hierarchy of evidence’ comes from. Not everything you read about science in a newspaper is wrong; it’s just that it will not have gone through as many peer and specialist review processes as the information found in a journal such as Ophthalmic and Physiological Optics. A newspaper may also end up moving away from nuance and towards misrepresentation in the translation from specialist to lay description.
Finally, is the evidence so compelling as to be the end of the debate? Should you start trying to prevent myopia or not? Again, there are tools for that. The GRADE scale provides a mechanism to assess the quality of the available evidence and how compelling it is, in order to move to a clinical decision, which involves balancing the benefits, risks and burdens that relate to an intervention or treatment. And at that point, your practice is evidence-based.
Last year, a well-known topical mast cell stabiliser was claimed to start relieving symptoms in just two minutes. Interested to see whether this was due to the lubricant effect of the product or a change in our understanding of how mast cell stabilisers work, we looked up the published research quoted to substantiate the claim.
The research by Montan et al. was based on qualitative feedback from a sample of twelve patients compared against a placebo; the research itself was well conducted and very interesting. However, given the sample size, more research would be needed to have confidence in the two-minute claim. Conversely, a systematic review of mast cell stabilisers published in 2015 reviewed 30 trials with a total of 4,344 participants, and this is more typical of the weight of evidence required to give us the confidence to change clinical practice. The review found that mast cell stabilisers were safe and effective in alleviating symptoms at 3, 7, 14 and 21 days after starting treatment, but it did not attempt to establish whether they started working after two minutes, so it can neither support nor refute the two-minute claim.
In short, it is helpful to ask questions about what you hear or read, or at least to ask where it’s coming from. The hierarchy of evidence is not perfect, but it’s a good start in deciding whether you think a claim, a statistic, indeed a ‘fact’, can be trusted. The College’s many resources in this area, from OPO and Evidence-based Syntheses to our Clinical Advisers and Clinical Management Guidelines, use these processes to help you decide what to recommend to the patient in front of you.
We’ll be producing some short materials about dissecting evidence for you to use at Optometry Tomorrow. While this blog has been unfairly low on promoting the great quality content at OT17, it has hopefully given you an approach to use in all the sessions you attend – one where you can, quite literally, question everything that you hear.