Figuring out how to evaluate the quality of sources, to separate good science from bad, took me a long time. I was sleep-deprived, had postpartum anxiety, and was scared of something I had not thought much about before having kids.
The woman who had been a mentor to me in my career, with whom I had entrusted my life and that of my child through my pregnancy and birth, whispered in my ear a few minutes after my son was born not to let the hospital give him the hepatitis B vaccine. This was the first and last communication we had about vaccination, but filtered through my trust for her, it planted a seed of doubt and convinced me at the time that there was a good reason to avoid it. Her suggestion, which I followed, could have cost my son his life, but it took me almost two years to figure out why.
I set out on a path of sleepless nights to figure out what those reasons might be, and the internet can certainly deliver on that front. The problem was, most of what I was reading was misinformation designed to exploit my fears. I wasn't yet familiar with the idea of confirmation bias, or with the way search algorithms fill the first page of results with poor sources.
It took me some time to figure out that Google is the confessional of our anxieties: the most commonly searched fears get boosted, and higher-quality links with more technical language tend to lag behind, as we often don't know which keywords will give us the answers we are looking for. If we think of the internet as a library, it would be as if register-adjacent tabloids and flyers handed to you on the street by conspiracy theorists were filed in the most prominent shelf spaces amongst the reference books.
Quality information is available, but it can take some digging to get to it. As a result, I got a lot more scared before I learned how to better interpret the quality of what I was finding. I eventually learned to type whatever scared me alongside “+criticism” or “+debunked” if I wanted to find a different perspective on a topic. Doing that is how I stumbled upon Skeptical Raptor’s well-cited breakdowns of things I had been reading that had me all twisted up in knots. Learning to use Google Scholar for searches also helped to cut through the noise when I was looking for actual research.
I feel a lot of regret for the time I left my child vulnerable to vaccine-preventable diseases because of fear. I hope some of the tools I found along the way for evaluating sources will be useful to other parents who find themselves facing the same stress.
The SMELL Test for bad science
In the modern media landscape, misinformation is both a cottage industry and big business. If a site can get your clicks, and write something compelling enough that you share it so all your friends click too, that creates ad revenue. Manufactured fear also sells supplements, books, movies, and non-evidence-based treatments, so there are plenty of market motivations to create new and scary-sounding content. That means we, as consumers of media, need to take extra steps to avoid being misled.
This test was created to help students figure out how to separate good from bad science (or fact from fiction):
- S stands for Source. Who is providing the information?
- M is for Motivation. Why are they telling me this?
- E represents Evidence. What evidence is provided for generalizations?
- L is for Logic. Do the facts logically compel the conclusions?
- L is for Left out. What’s missing that might change our interpretation of the information?
Learn about logical fallacies and bad science
There are logical fallacies that come up over and over again in the writings of people trying to scare you about vaccine safety. Once you become familiar with them, you'll be able to spot when they are being used in place of valid, evidence-based arguments. Ask for citations; if none are provided, you can be relatively certain you are being manipulated or are talking to someone who is misinformed and pushing bad science.
If you see a specific researcher or organization who supports the scientific consensus being villainized, that is your tip-off that they have something to say that is inconvenient to the narrative you are reading, and they are definitely worth checking out.
If citations are provided in support of a claim, you will then need to vet them for quality; I will address how to do that further down. Sometimes you will be handed an avalanche of citations that are only loosely related, in an attempt to overwhelm you. That is a fallacious debating tactic called the Gish Gallop.
People who have strong evidence to make their case generally refrain from doing that, because it is a dirty debate technique that would hurt their credibility. Most Gish Gallop lists I've encountered are copied and pasted by people who don't understand their contents. Luckily, there are also people who have patiently examined the studies in those lists, and they often have explanations posted online of how the citations are misleading, or even contradict the claims being made.
Appeal to Emotion – Manipulating an audience by triggering strong emotions, often used to distract from lack of data to support a claim. This is often combined with anecdotes. And what could be more emotional than how we feel about our kids?
Appeal to Nature – The idea that all natural things are inherently better, despite many natural things (like viruses) being dangerous, is a commonly used fallacy. I personally think it is especially poorly applied in arguments against vaccination, because vaccines work with our natural systems to achieve immunity in the same way the diseases they prevent do; the only meaningful difference is that you do not have to fall ill and survive in order to develop the antibodies that give you resistance.
I have encountered another naturalistic fallacy applied to diseases like measles: the suggestion that they are an important part of our evolutionary history. What people making that argument don't seem to realize is that many vaccine-preventable zoonotic diseases are quite new on the timeline of evolutionary history.
It is thought that measles didn't emerge until about 1,700 years ago. If someone thinks grains are too new to be digested properly but also tells you to embrace the idea of measles infections in your children, you can be fairly certain that they have a weak grasp of history and evolutionary biology. If you want to learn more about how diseases jumped from animals to humans, I recommend checking out the book Beasts of the Earth: Animals, Humans, and Disease.
Anecdotal – Substituting personal perception of an event for valid statistics. Often used to deny epidemiological data in anti-vaccine arguments. This fallacy is not used by accident; it is easier for our minds to latch onto one negative personal story than to wrap our heads around the billions of positive experiences that exist only as data sets.
Middle Ground – Insisting there are two valid sides to every argument. This is also known as false balance. The middle ground has to be earned; when the body of evidence leans 99.999% in one direction, there are not two valid sides.
False Cause – Misattributing cause based on a perceived relationship between two things. This happens with anecdotes that blame vaccination for unrelated things, like common childhood conditions that develop at the same rate in vaccinated and unvaccinated populations. I've also encountered it in the misinterpretation or misrepresentation of studies, with links to studies that don't support the claims being made.
Becoming familiar with Paracelsus's first law of toxicology, "the dose makes the poison," will be helpful in spotting this one. I've seen many studies cited in which the dosages studied were so high as to be meaningless in comparison to the claim being made. The dose always makes the poison. This fallacy was one of the first things that tipped me off that something might be wrong with what I was reading online. When I started trying to understand the citations being used, I found that they often matched only in title keywords and were frequently unrelated in their findings to what I had been reading.
I have never been able to figure out if those citations were used just to add an air of authority with the hope no one would click on them, or if the authors of what I was reading had even less of an understanding of what they said than I did as a mom with no previous training in study interpretation.
Burden of Proof – Turning the tables on you, when it is the job of the person making a claim to support it. This often manifests in the phrase, “Do your own research.”
Cherry Picking – You will see this often, because studies with findings in favor of anti-vaccine arguments are extremely rare, and when they do occasionally get published, I can assure you they will not be presented in a way that examines how they fit into the body of data as a whole. They will instead be presented as stand-alone arguments in favor of a claim. Cherry picking also happens within quality studies, to misrepresent what their data actually says. It is very important to remember that no single study equals science. A scientific consensus is very difficult to come by, and once achieved, it requires an equal or greater amount of evidence to overturn. There are also a lot of predatory pay-for-play journals out there that will accept submissions without proper peer review, so the mere fact that a study is in print says nothing about its quality.
Appeal to Authority – There are a number of ways this fallacy can be used, but with regard to claims of secret or cutting-edge knowledge about vaccination, beware of people who vaguely claim to "work in the medical field." I have seen this description used by a surprising number of receptionists, billing specialists, drug store cashiers, massage therapists, and chiropractors with no training in clinical research or medicine.
Another form of appeal to authority is expert opinion, which sits low on the hierarchy of evidence. Even if what you are reading was written by someone with an M.D. after their name, their opinion is not sufficient to overturn a scientific consensus.
For a more in-depth explanation about logical fallacies check out these links.
Learn about cognitive biases and bad science
Cognitive biases are what keep us from seeing the world as it is rather than as we think it should be. They are shortcuts we rely on to survive in a world with too much data to think through every single thing we encounter in a day, but developing an awareness of them can help us perceive the world more accurately.
One of the primary cognitive biases that I, and I think many people, wrestle with when learning about a new topic is confirmation bias: when faced with too much information, we are drawn to data that confirms our existing beliefs. If we go into a topic believing there is reason to be scared, we will be more drawn to, and more likely to remember, the scary stories about it.
The cognitive bias cheat sheet at this link is one of my favorite explanations of the biases we will encounter when using human brains to interpret the world.
Web of Trust
The Web of Trust isn't perfect because it is crowd-sourced, and people with strong feelings tend to group together and pile on in both positive and negative ways (there are some wooey sites with good ratings simply because they are popular with certain crowds), but I have found it helpful as an initial vetting system for getting a feel for a site's reputation. It also alerts you to websites that might harm your computer or misuse your information.
Learn about bad science
One of the best ways I've found to get a feel for what good science looks like is to study bad science. If you are interested in learning how to interpret the quality of studies for yourself, I highly recommend the book Bad Science: Quacks, Hacks, and Big Pharma Flacks by Ben Goldacre. It is a crash course in spotting the most common things that lead to poorly designed studies. It is more interesting than it sounds, I promise. Here is a TED talk he gave if you want a taste of his writing style and an introduction to what he covers in the book.
Learn how to recognize good science
Learning how to read a study critically will be helpful. This is an easy-to-follow guide (pdf) I used to learn how to do it. It will be very time-consuming at first, but it is a skill worth developing, and it will pay off by speeding up your ability to sort the good from the bad.
What does good science look like when trying to find information that you can use to make informed decisions?
- One of the most important things is that it demonstrates clinically significant, not just statistically significant, findings.
- The conclusions and abstract should not be misleading compared to the data presented.
- Good science doesn’t start with a conclusion and then set out to prove a preconceived notion through fishing expeditions and cherry picking. When you encounter that, you are witnessing pseudoscience. If a researcher has a reputation for being at odds with the scientific consensus on a topic, you might want to approach their work with extra scrutiny for bias and methodological errors.
- If there are a lot of strong statements in the abstract and introduction, or there is a lot of editorializing, colorful language, and opinion mixed in, that can also be a red flag.
- Good science is genuinely hypothesis-driven and conducted with the understanding that results should be reproducible by other researchers before being accepted with confidence. This is why you will often find the phrase “more research needed” at the end of a study.
- Researchers putting out high-quality work also tend to avoid Galileo complexes; unlike many fringe researchers, they refrain from going on self-published book tours that take every novel finding directly to the public, bypassing peer review to announce to the world how cutting-edge they are.
You will need more than an abstract to know what a study is actually saying. Papers behind paywalls are sometimes available directly from the researchers if you ask nicely. There are also tools you can use to search what is available for free like Unpaywall.
Not all journals are created equal. There are many potentially predatory journals out there that might not have particularly robust peer review; some don't have it at all. It is important to remember, though, that not all open-access journals are predatory. I find this list useful when I haven't heard of a journal before: https://beallslist.weebly.com/
Something you will encounter a lot is people trying to pass off low-level evidence as equal to or more credible than higher levels of evidence. A YouTube video or an opinion paper by a researcher is not even in the same ballpark as a systematic review of the body of evidence. A cell study in a petri dish cannot be used to state what will happen in a person or to invalidate the findings of higher-level studies.
An example of this you have probably encountered, if you spend any time on social media, is essential oil sales pitches. Their claims are based almost exclusively on antimicrobial effects in a petri dish, and those studies are grossly misused to suggest safety and efficacy in humans, skipping the many levels of research required to demonstrate both safety and a clinically significant effect. Anything from temperature, to soap, to bullets could have a similar effect on microbes at that level of research. Humans are extremely complex organisms, and the bulk of optimistic low-level findings in cell and animal studies never show anything significant in further testing in humans.
The following infographic and link will help you rank and better understand the strength of the evidence you encounter.
Many anti-vaccine studies I have read have ultimately been retracted, if they weren't published in predatory pay-to-play journals. People sharing retracted studies are often not aware of the retraction, or, if they do know, they don't volunteer that information because it would weaken their argument. And no, it isn't because of some conspiracy to keep those researchers down; every single one I have encountered was retracted because of poor methodology, failure to disclose conflicts of interest, or, in the case of several researchers, blatant data manipulation that was missed in the initial peer review process. See Retraction Watch for ongoing coverage of retracted bad science.
PubPeer for bad science
This is where some of the fun happens behind the scenes. It was like a slow-motion thriller watching data manipulation get discovered and shared in real time for a study that was retracted a few months later. If a study strikes you as questionable, I'd suggest running the title through PubPeer to see if it has been discussed.
Ask for help
If you are given what looks like a legitimate claim or study, but it is over your head, there are science communicators, epidemiologists, immunologists, clinical researchers, and biostatisticians out there who are excited to talk about research in their fields. If you are shy about contacting them directly, many are findable on social media if you start following science communication groups in areas you are curious about. They have the perspective to help you figure out how a study you have questions about fits into the larger body of evidence, and they may have helpful information about the reputation of the study authors.
I know it can be difficult to tell who is an expert when you are first looking at things like this. I personally found that pseudoscientists and pseudo-experts who don’t have evidence on their side tend to become angry when you ask them questions because all they have to support their position is a narrative or low levels of evidence. If you are unsure about the level of expertise of who you are communicating with, I would suggest that you start asking questions, a lot of them.
I would like to give special thanks to the people who got angry and deleted my questions or kicked me out of groups for asking questions when I started to realize they were writing things at odds with the evidence I was starting to find. Who knows how much longer I might have approached this subject with an uncritical eye if my questions hadn’t resulted in being ostracized by the people I had been turning to for help.
This guest post was written by Brooke Fotheringham, mom, and photographer with a passion for science communication and public health.