“Bad Advice” by Paul Offit – a book review by Dorit Rubinstein Reiss


A new book, “Bad Advice: Or Why Celebrities, Politicians, and Activists Aren’t Your Best Source of Health Information” by Dr. Paul Offit, is different from his previous writings in two ways – much of it is autobiographical, with a lot of personal anecdotes, and it is about science communication rather than the science itself.

“Bad Advice” opens with the story of a 1997 TV interview Dr. Offit gave, and how he bungled – by his own account – a question about which vaccines children get, how many, and when. The story sets the tone for the book – it’s funny, it’s candid about what Dr. Offit thinks he did wrong, and it offers sound advice for other science communicators.

To a large extent, this book was written for those engaged in science communication, and it is full of tips that can help every current or would-be science communicator.

What gives the book its charm are the anecdotes and humor sprinkled throughout it, and its accessible, conversational tone, but I don’t think I can mirror that here without spoiling the effect – these are best enjoyed in context. So this review describes the content but does not capture what makes “Bad Advice” so much fun.

In full disclosure, I greatly admire Dr. Offit, have sought his advice and help on many issues in my writing on and advocacy related to vaccines, and consider him a personal friend. I also read a draft of the book and provided comments.

Why Science Communication?

The first three chapters of “Bad Advice” provide important background by explaining why science communication is needed, and some of the obstacles to it. 

The first two chapters of the book set out what science is and what scientists do, and why their training and background make it difficult for them to be effective science communicators. Among the things covered – again, with a lot of humor, humility, and personal anecdotes – are that much scientific work is done alone, and that much of what it requires makes people less, rather than more, suited to working with people.

Dr. Offit discusses how the scientific method trains scientists away from making absolute statements, even though qualified statements can backfire when communicating about science; the challenge of reducing complex, nuanced reality into sound bites that work in a digital age; and more.

The next chapter analyzes why we need science communication, and why people – however smart – may fall for misinformation. It looks at several natural human traits that make us easy to mislead on scientific issues. “Bad Advice” also examines our difficulty identifying and assessing risks, and the pull of celebrities as authority figures, even though they may not have the background to provide good information and may, in fact, promote bad information (for example, Robert F. Kennedy Jr. – revisited later in the book – constantly provides bad information about vaccines). The chapter also talks about other limits on the ability of humans to think rationally and the ways we acquire knowledge.

Having thus set the stage – why it’s important to engage in science communication and some of the challenges involved – Dr. Offit is ready for the next step.

Good advice vs bad advice

Chapters 4 through 7 offer direct advice on communication, through personal anecdotes of things that worked and things that didn’t over Dr. Offit’s more than 20 years of doing it.

In chapter 4, Dr. Offit offers “some painful, hard-earned, and occasionally humorous lessons gleaned from personal experience” on communicating with the public. These range from the deeply practical (“be comfortable”) to the content-based (“be sympathetic,” in the context of an eleven-year-old diagnosed with AIDS at a time when HIV was a death sentence, and “Don’t panic. The facts are your safety net.”). But they’re invariably written as amusing anecdotes leading to a useful punchline. In one of the stories, Dr. Offit describes how he arrived at the famous “10,000 vaccines” quote that anti-vaccine activists like to misuse. The punchline? “You are going to say things that, although scientifically accurate, you will regret. It’s unavoidable.”

Chapter five addresses whether it’s appropriate for scientists to debate science deniers, using several examples. Dr. Offit’s recommendation is to avoid it, though he does provide three examples of such debates that succeeded. His conclusion is that he, personally, is too angry and passionate about vaccine issues to participate successfully – because every year he sees children die from preventable diseases, “invariably … because parents have chosen not to vaccinate their children. And the reason they had made that choice was that they had read or heard bad information …”

“Bad Advice” ends with a recommendation that “debating the undebatable is worthwhile” if, and only if, scientists can see the discussion as a teachable moment and not focus on the people they are debating or the others in the room.

I’m not sure I agree, at least in terms of a televised debate. I think Dr. David Gorski said it well when he wrote:

…debating cranks doesn’t sway anyone, sharing the stage with a real scientist does unduly elevate the crank in the eyes of the public. Besides, whatever the seeming outcome of the debate, you can count on the crank to declare victory and his believers to agree. In any event, science isn’t decided by the metrics used to judge who “wins” a public debate, which rely more on rhetoric and cleverness rather than science to decide the outcome. Finally, such debates are not without risks. Although Julian Whitaker, for example, was terrible at it, other cranks are adept at the Gish Gallop, and an unprepared skeptic or scientist can be made to appear clueless in front of a crowd that is almost always packed with supporters of the crank, not the skeptic.

I think I agree with Dr. Offit’s initial position that agreeing to a debate is a bad idea.

Chapter six looks at the role of comedians in combating misinformation about science, focusing on vaccines – covering the Penn & Teller episode, Jimmy Kimmel, The Daily Show, and The Colbert Report. And I’m really going to let you read that by yourselves. It’s fun.

Chapter seven looks at the ways cinema helps or harms science communication. It opens by comparing two films about outbreaks – “Contagion,” which got the science right, and “Outbreak,” which did not. To give a flavor, when talking about “Outbreak,” Dr. Offit describes how a monkey carrying the harmful virus was caught, and the movie scientists had to “determine which antibodies are neutralizing the mutant virus, synthesize those antibodies, and make several liters of life-saving antisera. Assuming everything goes well, Hoffman’s task should take about a year. Cuba Gooding Jr. does it in a little less than a minute. (Now I understand why people are angry that we still don’t have an AIDS vaccine.)”

Nonetheless, Dr. Offit sees an important role for movies in science communication, and urges scientists to work with filmmakers to get it right.

Science communication in action – confronting the anti-vaccine movement

The last part of the book uses the anti-vaccine movement as a case study in the pitfalls and successes of science communication.

Chapter 8 of “Bad Advice” looks at how charismatic figures can promote anti-science misinformation. Although it covers several examples, the heart of the chapter is the case of Andrew Wakefield, the British doctor who promoted misinformation about MMR. Dr. Offit tells the dramatic story of Wakefield’s rise, the scientific literature that proved him wrong, and the discovery of the misdeeds that led to his fall. He describes Wakefield’s situation today – thoroughly discredited, on par with other conspiracy theorists – through his participation in the infamous Conspirasea Cruise. The end of the chapter examines different explanations for why Wakefield sticks to his original claims, years after they’ve been thoroughly disproven. I’ll let you find out for yourselves. It’s not exactly flattering to Wakefield, though.

Chapter 9 looks at the role of politicians in promoting anti-science misinformation, focusing on Dan Burton’s hearings that tried to make the case that vaccines cause autism (YouTube snippets of the hearings, out of context, are still used by anti-vaccine activists; Dr. Offit will give you a more comprehensive view). Dr. Offit also tells of his own experience at the hearing, and what he sees as errors committed because of his naiveté and inexperience. It’s half sad and half comical to read through both his preparation for the hearing and the actual experience with Mr. Burton, who came in with a set conclusion and a set role he wanted Dr. Offit to play, and who tried to delegitimize him.

Chapter 10 warns science communicators to expect a campaign of personal delegitimization and attacks, drawing on Dr. Offit’s own extensive experience as a target of anti-vaccine efforts. These range from hateful emails, through lawsuits, to death threats. It’s painful reading, but it is incredibly important for people who go into these areas to be prepared for the ugly reactions, in all their extreme forms, from misguided but passionate people on the other side.

Chapter 11 goes more deeply into Dr. Offit’s own reasons for entering the fray. It is very autobiographical (some of the events in it were described in Dr. Offit’s other books, but many will be new to readers), telling his career story – again, with lots of humor and more than a few lumps. This is to explain what motivates him to speak up, and, to some degree, to counter the claims accusing him of having a conflict of interest because of his involvement in the creation of the rotavirus vaccine. It’s a powerful chapter.

Chapter 12 ends on an optimistic note, pointing out things that have improved in the war for science – the rise of science bloggers, the better attitude of the media. And in the epilogue, Dr. Offit closes with the March for Science, as an embodiment of the willingness of science supporters to fight back.

Takeaway

In this very autobiographical, often humorous, extremely candid book, full of good advice, Dr. Offit does a service to science communicators by telling them what worked, what didn’t, and offering some thoughts on what comes next. You may not always agree with his advice, but you are very likely to agree with large parts of it, think about much of it, and enjoy the way it’s delivered. It’s a very fast read, and worth reading and probably rereading. And rereading.






Vaccine research – it doesn’t mean what the anti-vaxxers think it means


How many times have you read a comment from an anti-vaccine zealot along the lines of “do your research, vaccines are bad”? That comment seems to imply two things – that the anti-vaxxer believes they have done real vaccine research, and that those on the science/medicine side have not.

Typical of nearly every claim made by the anti-vaccine religion, this is another one where they understate how hard vaccine research really is while overstating their own skills and experience in comprehending real scientific research. I suppose this is a perfect example of the Dunning-Kruger effect – a cognitive bias wherein people without a strong scientific background fail to recognize their ineptitude in the field and mistakenly rate their knowledge and abilities as greater than they are.

On the other hand, I’ve done real scientific research and worked hard at it. Time to explain. Continue reading “Vaccine research – it doesn’t mean what the anti-vaxxers think it means”

How to prevent cancer in 12 easy steps – vaccines are critically important


I have railed against pseudoscientific charlatans who claim that they have the easy way to prevent or cure cancer. Generally, these snake oil salesmen try to convince you that they have some miraculous food, supplement, spiritual energy, and on and on, that can either stop cancer in its tracks or keep it from ever growing in your body. Of course, none of their claims are actually supported by robust science. On the other hand, real science has 12 evidence-based methods to actually prevent cancer.

But what about those memes that say that supplements prevent cancer? Nope, they don’t. And that’s been shown in study after study after study after study (yeah, I could go on for a while).

What about avoiding GMO foods because they cause cancer? Again, studies show that GMO foods have no effect on cancers. Oh, one more thing – bananas don’t have tumor necrosis factor, and the yellow fruit can’t prevent or cure cancer (but that doesn’t mean it isn’t delicious).

Despite the absolute lack of evidence that supplements, kale, bananas, or drinking the pure waters of a glacier-fed stream (which may not be an option with climate change) do anything to prevent cancer, there are only a few things that can be done to manage your overall risk of cancer.

How to prevent cancer has been codified by the World Health Organization’s (WHO) International Agency for Research on Cancer (IARC) into 12 steps (no, not that debunked one) that are called the European Code Against Cancer.

Let’s look at cancer and how to prevent it.

Continue reading “How to prevent cancer in 12 easy steps – vaccines are critically important”

Hierarchy of scientific evidence – keys to scientific skepticism


I am a scientific skeptic. That means I pursue published scientific evidence to support or refute a scientific or medical principle. I am not a cynic, which is often conflated with being a skeptic, and I don’t come to these ideas with a preset opinion. Scientific skepticism depends on the quality and quantity of evidence that supports a scientific idea, and examining the hierarchy of scientific evidence can be helpful in deciding what is good data and what is bad – what can be used to form a conclusion, and what is useless.

That’s how science is done. I use the hierarchy of scientific evidence to weigh the quality, along with the quantity, of evidence in reaching a conclusion. I am generally offended by those who push pseudoscience – they try to find evidence that supports their predetermined beliefs. That’s not science; it’s the opposite of good science.

Unfortunately, today’s world of instant news, with memes and 140-character analyses flying across social media, can be overwhelming. Sometimes we create an internal false balance, assuming that headlines (often written as clickbait) on one side are somehow equivalent to those on the other. So we think there’s a scientific debate when there isn’t one.

I attempt to write detailed, thoughtful, and nuanced articles about scientific ideas. I know they can be complex and long-winded, but I also know science is hard. It’s difficult. Sorry about that, but if it were that easy, everyone on the internet would be doing science. Unfortunately, there are too many people writing on the internet who think they are talking science but fail to differentiate between good and bad evidence.

But there is a way to make this easier. Not easy, just easier. This is my guide to an amateur (and, if I do a good job, professional) method for evaluating the quality of scientific research across the internet.

Continue reading “Hierarchy of scientific evidence – keys to scientific skepticism”

Science mistakes – debunking a trope loved by pseudoscience


I know I shouldn’t use the conspiracy theory fallacy when talking about the pseudoscience-pushing science deniers, who provide the bread and butter of topics for skeptics. But I keep observing the same ridiculous and insanely illogical arguments used in the same manner by all of the deniers, including the oft-repeated “science mistakes” trope. Honestly, I think the pseudoscience pushers meet annually in Sedona, Arizona, ground zero of woo, to discuss which trope they’re pushing this year.

The anti-vaccine zealots, creationists, anthropogenic global warming deniers, and whoever else pretends to use science to actually deny science frequently focus on this theme of “science mistakes.” They then produce a list of cherry-picked examples that “prove” that science is wrong (see Note 1). Of course, this indicates more of a misunderstanding of what science is, and of its history, than any real condemnation of science. But your typical science denier is probably not going to let facts get in the way of maintaining faith in their beliefs. So let’s deconstruct and discredit this “science mistakes” trope.

By the way, in my story, I admit that there are many “science mistakes,” so read on. Hopefully, it’s somewhat enlightening. Continue reading “Science mistakes – debunking a trope loved by pseudoscience”

Fake science about Star Trek accepted by predatory journals – anti-vaccine researchers happy


One of my pet peeves, of which there are many, is when a fake science paper is published in a low-ranked journal and trumpeted as if it were Nobel Prize-worthy research. You can read about anti-vaccine fake science published in these journals by notorious anti-vaccine “researchers” like Shaw and Tomljenovic, Exley, and Shoenfeld.

One of my pet loves is Star Trek – all versions, all the time. In fact, I occasionally have secret conversations with my fellow Big Pharma shills about Star Trek, in which vaccines are never mentioned. I am a self-confessed Star Trek nerd who has watched almost every episode, from Star Trek: The Original Series (ST: TOS) through the current Star Trek: Discovery (see Note 1).

So when the opportunity to combine Star Trek and anti-vaccine nonsense falls into my lap, I am happier than a pregnant tribble. And when a fake science paper about the Star Trek universe gets accepted by low-ranked predatory journals – ones that are beloved by pseudoscience adherents across the world – it’s what I live for. Continue reading “Fake science about Star Trek accepted by predatory journals – anti-vaccine researchers happy”

Oprah for President – another billionaire pseudoscience pusher looking for a job


A few nights ago, Oprah Winfrey, billionaire media personality, gave a speech during a Hollywood award show, where fellow millionaires and billionaires get dressed up in ten-thousand-dollar gowns and tuxes to pat each other on the back. Within nanoseconds of her admittedly powerful speech, desperate liberals and Democrats were suddenly chanting “Oprah for President.”

Of course, Ms. Winfrey has sent some mixed messages as to whether she will run for president, but as I observed long ago, in politics denials have all the value of “a bucket of warm piss.” But if she did decide to run, I get the feeling, from reading posts across social media, that she’d move to the head of the class of Democratic candidates for President of the United States. She’d surpass more highly qualified progressive Democrats such as Elizabeth Warren and Kirsten Gillibrand, both of whom would get my unconditional support for president (as if anyone would care).

So, why am I commenting on potential presidential candidates two years before the election? I’m sure some of you readers are mumbling, “Stick to science, you dumb feathered dinosaur. That’s why I’m here.”

But kind madam, it is about science. And based on science, a push for “Oprah for President” will not get my support. Continue reading “Oprah for President – another billionaire pseudoscience pusher looking for a job”

Scientific consensus – collective opinion of scientists


In the hierarchy of scientific principles, the scientific consensus – that is, the collective opinion and judgment of scientific experts in a particular field – is an important way to separate real scientific ideas and conclusions from pseudoscience, cargo cult science, and other beliefs.

I often discuss scientific theories which “are large bodies of work that are a culmination or a composite of the products of many contributors over time and are substantiated by vast bodies of converging evidence. They unify and synchronize the scientific community’s view and approach to a particular scientific field.”

A scientific theory is not a wild, arbitrary guess; it is built upon a foundation of scientific knowledge that is itself based on evidence accumulated from data produced by scientific experimentation. A scientific theory is considered the highest scientific principle, something that is missed by many science deniers. In addition, a scientific consensus is formed by a similar method – the accumulation of evidence.

I have written frequently about the scientific consensus, because it is one of the most powerful pieces of evidence in a discussion about critical scientific issues of our day – evolution, climate change, vaccines, GMOs, and many areas of biomedical knowledge.

This tome has one goal – to clarify our understanding of the scientific consensus and how we arrive at it. Through this information, maybe we can all see its power in determining what is real science and what are policy and cultural debates.

Continue reading “Scientific consensus – collective opinion of scientists”

Coffee health effects – what does the best science say


Coffee is one of the most consumed beverages worldwide, with tea being number one. And as I have mentioned previously, I am an unrepentant coffee lover. Over the years, there have been a number of claims about coffee health effects, both positive and negative, many without any solid scientific evidence in support.

Claims about coffee health effects go back centuries. These claims were often confusing and contradictory. How many “studies” have we read about that said drinking it was good for your heart? Or bad for your heart. Or that it prevented cancer. Or that it increased your risk of cancer.

Part of the confusion is that the popular press, with its strange dependence on false equivalence, often presents two contradictory scientific studies as equivalent, even if they aren’t. Well, we’re going to look at a powerful new study that examined health outcomes that can be related to coffee. Let’s see what it says. Continue reading “Coffee health effects – what does the best science say”

Vaccines cause diabetes – another myth refuted and debunked


If you cruise around the internet engaging with the antivaccination cult (not recommended), you will pick up on their standard tropes, lies, and other anti-science commentaries. One that has always bothered me – not because it was a lie, but because I had enough evidence floating in my brain that I wondered if it were true – is the claim that vaccines cause diabetes, especially the Type 1 version.

A lot of vaccine deniers believe that vaccines cause a lot of everything, and several claims that vaccines cause Type 1 diabetes (or here) are based on little evidence. As far as I can tell, this myth is based on the “research” of J. Barthelow Classen, M.D., who has pushed the idea that vaccines cause Type 1 diabetes through some magical process that has never been supported by independent evidence.

In another example of the antivaccination world’s cherry-picking evidence to support their a priori conclusions, they ignore the utter lack of plausibility of any link between vaccines and Type 1 diabetes. At best, Classen has cherry-picked statistics to support his predetermined conclusions, “comparing apples to oranges with health data from different countries, and misrepresenting studies to back his claim.”

Moreover, Classen seems to come to his beliefs based on population-wide correlations that rely on post hoc fallacies, rather than actually showing causality between vaccines and diabetes. It’s like finding that a 5% increase in consumption of Big Macs is correlated with Republican wins in elections. They may happen at the same time, but it would take a laughable series of events to show any relationship.

Continue reading “Vaccines cause diabetes – another myth refuted and debunked”