Courts and science – talc and glyphosate probably do not cause cancer


I’ve written about this many times before – courts do not get to decide what is good or bad science. Although courts and science are not necessarily incompatible, attorneys, juries, and judges are generally not trained in scientific research, methods, publications, or reasoning.

Two relatively recent cases are strong evidence that courts and science can be quite incompatible. In the first case, a jury ordered pharmaceutical giant Johnson & Johnson (JNJ) to pay US$4.69 billion in damages to 22 women who claimed that the company’s talcum powder products caused ovarian cancer. In the second case, a jury ordered chemical manufacturer Monsanto to pay US$289 million to a school groundskeeper who claimed that Roundup (glyphosate) caused his terminal cancer.

The problem with both of these cases is that there is, at best, weak, unreplicated scientific evidence supporting their claims. However, if you refrain from cherry-picking articles on PubMed, you’ll find that the vast majority of research either doesn’t support their claims or shows outright that there is no link between talc or glyphosate and cancer.

Let’s take a look at the science in both of these cases, and then let’s find out why courts and science are not necessarily compatible. And remember, this isn’t a recent problem – an American court once rejected evolution during the infamous Scopes Monkey Trial. So scientists have been skeptical of courts ruling on science for a very long time. Continue reading “Courts and science – talc and glyphosate probably do not cause cancer”

Ten thousand years of GMO foods – making inedible edible


One of the tropes of the anti-GMO movement is that nature does it better for food – an appeal-to-nature logical fallacy. In other words, they believe that our ancestors’ foods are somehow better than our GMO foods. Of course, this ignores the fact that there are over ten thousand years of GMO foods – it’s really not something that showed up during the last century or so.

People seem to endow “nature” with a special status that is ridiculous. Evolution proceeds by a random process in which environmental changes select for certain mutations over time (and yes, I’m oversimplifying the process), which is called natural selection. Moreover, some random mutations simply occur and provide no benefit to the organism, although they might in the future because of some environmental change.

Nature has no goal. It has no guidance. It has no underlying value of good or evil. Unless you believe that some higher being controls it – and at that point, you’re a creationist – claiming that “nature” is better than the alternative is basically ridiculous.

So, we’re going to talk about how genetic modification has moved from the early days of waiting for a random, beneficial mutation to the modern world of genetic modification.

Continue reading “Ten thousand years of GMO foods – making inedible edible”

Solid GMO scientific consensus – based on real science


Over and over, I’ve read comments on the internet (obviously, my first mistake) that there is no GMO scientific consensus regarding whether genetically modified organisms (generally crops or food) are safe for humans, animals, and the environment. Well, that’s simply not the case.

Furthermore, there are even claims that GMOs are not necessarily more productive or higher-yielding, and that so-called organic foods are healthier (they aren’t) and better for the environment. Again, that’s not necessarily the case.

Let’s look at anthropogenic (human-caused) climate change since it also has this huge controversy over whether there’s a scientific consensus. Over 97% of published articles that expressed a conclusion about anthropogenic climate change endorsed human-caused global warming. If that were a vote, it would be a landslide that would make dictators jealous.

According to Skeptical Science, it’s even more than that:

We should also consider official scientific bodies and what they think about climate change. There are no national or major scientific institutions anywhere in the world that dispute the theory of anthropogenic climate change. Not one.

The consensus is so clear, outside of vocal, loud, junk-science-pushing individuals and organizations, that many scientists call it the “theory of anthropogenic climate change,” which would mean it’s at the pinnacle of scientific principles, essentially an unassailable fact.

Continue reading “Solid GMO scientific consensus – based on real science”

MSG myth – debunked with real science


Food additives are one of the most passionate issues amongst people who eat (which would be everyone). Aspartame. High fructose corn syrup. GMOs. Salt. Sugar. Trans fats. Polysorbate 80. But the MSG myth is one of the most pervasive in the food pseudoscience world (yes, I’m going to make that a thing).

Of course, these additives cause angst in people because of their scary chemical names. Or nonsense on the internet. Or random neurons firing.

Obviously, there is stuff, created by the beauty of natural sunlight and goddess-blessed sweet waters from the Alps, that is better than these evil man-made chemicals. Well, no. Everything in nature is made up of “chemistry” – 25-hydroxyergocalciferol is a scary chemical name, right? Except it’s the metabolic product of the conversion of vitamin D in the human liver. It’s natural!

But let’s get back to MSG – how many times have you seen “No MSG” on a sign at a Chinese restaurant? Is it because China, which has been using MSG in its cuisine for centuries, has been conspiring against Americans since the first Chinese restaurant started serving up kung pao chicken to unaware Americans?

It’s time to look at the MSG myth – is it real, or does it need a good debunking?

Continue reading “MSG myth – debunked with real science”

“Bad Advice” by Paul Offit – a book review by Dorit Rubinstein Reiss


A new book, “Bad Advice: Or Why Celebrities, Politicians, and Activists Aren’t Your Best Source of Health Information” by Dr. Paul Offit, is different from his previous writings in two ways – much of it is autobiographical, with a lot of personal anecdotes, and it is about science communication rather than the actual science.

“Bad Advice” opens with the story of a 1997 TV interview Dr. Offit gave, and how he bungled – by his own account – a question about which vaccines children get, how many, and when. The story sets the tone for the book – it’s funny, it’s candid about what Dr. Offit did, in his view, wrong, and it offers sound advice for other science communicators.

To a large extent, this book was written for those engaged in science communication, and it is full of tips that can help every current or would-be science communicator.

What gives the book its charm are the anecdotes and the humor sprinkled throughout it, and its accessible and conversational tone, but I don’t think I can mirror that here without spoiling the effect – I think these are best enjoyed in context. So this review describes the content but does not capture what makes “Bad Advice” so much fun.

For full disclosure, I highly admire Dr. Offit, have sought his advice and help on many issues in my writing on and advocacy related to vaccines, and consider him a personal friend. I have also read a draft of the book and provided comments. 

Why Science Communication?

The first three chapters of “Bad Advice” provide important background by explaining why science communication is needed, and some of the obstacles to it. 

The first two chapters of the book set out what science is and what scientists do, and why their training and background make it difficult for them to be effective science communicators. Among the things covered – again, with a lot of humor, humility, and personal anecdotes – are that much of scientific work is done alone, and that much of what it requires makes people less, rather than more, suited to working with people.

Dr. Offit discusses the fact that the scientific method trains scientists away from using absolute statements, but qualified statements can backfire when communicating about science; the challenge of reducing complex, nuanced reality into sound bites that work in a digital age; and more.

The next chapter analyzes why we need science communication – why people, however smart, may fall for misinformation. It looks at several natural, human features that make us easily wrong on scientific issues. “Bad Advice” also examines our difficulty identifying and assessing risks, and the pull of celebrities as authority figures, even though they may not have the background to provide good information and may, in fact, promote bad information (for example, Robert F. Kennedy Jr. – revisited later in the book – constantly provides bad information about vaccines). The chapter also talks about other limits on the ability of humans to think rationally and the ways we acquire knowledge.

After thus setting the stage for why it’s important to engage in science communication and some of the challenges, Dr. Offit is ready for the next stage.

Good advice vs bad advice

Chapters 4 through 7 offer direct advice on communication through personal anecdotes of things that worked and things that didn’t over Dr. Offit’s more than 20 years of doing it.

In chapter 4, Dr. Offit offers “some painful, hard-earned, and occasionally humorous lessons gleaned from personal experience” on communicating with the public. These range from the deeply practical (“be comfortable”) to the content-based (“be sympathetic,” in the context of an eleven-year-old diagnosed with AIDS at a time when HIV was a death sentence, and “Don’t panic. The facts are your safety net.”). But they’re invariably written as amusing anecdotes leading to a useful punchline. In one of the stories, Dr. Offit describes how he arrived at the famous “10,000 vaccines” quote that anti-vaccine activists like to misuse. The punchline? “You are going to say things that, although scientifically accurate, you will regret. It’s unavoidable.”

Chapter five addresses whether it’s appropriate for scientists to debate science deniers, using several examples. Dr. Offit’s recommendation is to avoid it, but he does provide three successful examples of such debates. His conclusion is that he, personally, is too angry and passionate on vaccine issues to participate successfully – because every year he sees children die from preventable diseases, “invariably … because parents have chosen not to vaccinate their children. And the reason they had made that choice was that they had read or heard bad information.”

The chapter ends with a recommendation that “debating the undebatable is worthwhile” if, and only if, scientists can see the discussion as a teachable moment, and not focus on the people they are debating or the others in the room.

I’m not sure I agree, at least in terms of a televised debate. I think Dr. David Gorski said it well when he wrote:

…debating cranks doesn’t sway anyone, sharing the stage with a real scientist does unduly elevate the crank in the eyes of the public. Besides, whatever the seeming outcome of the debate, you can count on the crank to declare victory and his believers to agree. In any event, science isn’t decided by the metrics used to judge who “wins” a public debate, which rely more on rhetoric and cleverness rather than science to decide the outcome. Finally, such debates are not without risks. Although Julian Whitaker, for example, was terrible at it, other cranks are adept at the Gish Gallop, and an unprepared skeptic or scientist can be made to appear clueless in front of a crowd that is almost always packed with supporters of the crank, not the skeptic.

I think I agree with Dr. Offit’s initial position that agreeing to a debate is a bad idea.

Chapter six looks at the role of comedians in combating misinformation about science, focusing on vaccines – covering the Penn and Teller episode, Jimmy Kimmel, The Daily Show, and The Colbert Report. And I’m really going to let you read that by yourselves. It’s fun.

Chapter seven looks at the ways cinema helps or harms science communication. It opens by comparing two films about outbreaks – “Contagion,” which got the science right, and “Outbreak,” which did not. To give a flavor, when talking about “Outbreak,” Dr. Offit describes how a monkey carrying the harmful virus was caught, and the movie scientists had to “determine which antibodies are neutralizing the mutant virus, synthesize those antibodies, and make several liters of life-saving antisera. Assuming everything goes well, Hoffman’s task should take about a year. Cuba Gooding Jr. does it in a little less than a minute. (Now I understand why people are angry that we still don’t have an AIDS vaccine.)”

Nonetheless, Dr. Offit sees an important role for movies in science communication and urges scientists to work with filmmakers to get it right.

Science communication in action – confronting the anti-vaccine movement

The last part of the book uses the anti-vaccine movement as a story of the pitfalls and successes of science communication.

Chapter 8 of “Bad Advice” looks at how charismatic figures can promote anti-science misinformation. Although it covers several examples, the heart of the chapter is the case of Andrew Wakefield, the British doctor who promoted misinformation about MMR. Dr. Offit tells the dramatic story of Wakefield’s rise, the scientific literature that showed he was wrong, and the discovery of the misdeeds that led to his fall. He describes Wakefield’s situation today – thoroughly discredited, on par with other conspiracy theorists – through his participation in the infamous Conspirasea Cruise. The end of the chapter examines different explanations for why Wakefield sticks to his original claims, years after they’ve been thoroughly disproven. I’ll let you find out for yourselves. It’s not exactly flattering to Wakefield, though.

Chapter 9 looks at the role of politicians in promoting anti-science misinformation, focusing on Dan Burton’s hearings that tried to make the case that vaccines cause autism (YouTube snippets of the hearings, taken out of context, are still used by anti-vaccine activists; Dr. Offit will give you a more comprehensive view). Dr. Offit also tells of his own experience in the hearing, and what he sees as errors committed because of his naiveté and inexperience. It’s half sad and half comical to read through both his preparation for the hearing and the actual experience with Mr. Burton, who came into the hearing with a set conclusion and a set role he wanted Dr. Offit to play, trying to delegitimize him.

Chapter 10 warns science communicators to expect a campaign of personal delegitimization and attacks, drawing on Dr. Offit’s own extensive experiences with anti-vaccine efforts to attack him. It goes from hateful emails, through lawsuits, to death threats. It’s painful but incredibly important for people who go into these areas to be prepared for the ugly reaction from misguided but passionate people on the other side, in all its extreme forms. 

Chapter 11 goes more deeply into Dr. Offit’s own reasons for entering the fray. It is very autobiographical (some of the events in it were described in Dr. Offit’s other books, but many will be new to readers), telling his career story – again, with lots of humor and more than a few lumps. This is to explain what motivates him to speak up and, to some degree, to counter the claims accusing him of having a conflict of interest because of his involvement in the creation of the rotavirus vaccine. It’s a powerful chapter.

Chapter 12 ends on an optimistic note, pointing out things that have improved in the war for science – the rise of science bloggers and the better attitude of the media. And in the epilogue, Dr. Offit ends with the March for Science as an embodiment of the willingness of science supporters to fight back.

Takeaway

In this very autobiographical, often humorous, and extremely candid book, full of good advice, Dr. Offit does a service to science communicators by telling them what worked, what didn’t, and some thoughts on what comes next. You may not always agree with his advice, but you are very likely to agree with large parts of it, think about much of it, and enjoy the way it’s delivered. It’s a very fast read, and worth reading and probably rereading. And rereading.


Vaccine research – it doesn’t mean what the anti-vaxxers think it means


How many times have you read a comment from an anti-vaccine zealot along the lines of “do your research, vaccines are bad”? That comment seems to imply two things – that the anti-vaxxer believes they have done real vaccine research, and that those on the science/medicine side have not done real vaccine research.

Typical of nearly every claim made by the anti-vaccine religion, this is another one where they understate how hard vaccine research really is while overstating their actual skills and experience in comprehending real scientific research. I suppose this is a perfect example of the Dunning-Kruger effect – a cognitive bias wherein people without a strong scientific background fail to recognize their actual ineptitude in the field and mistakenly overrate their knowledge and abilities as greater than they are.

On the other hand, I’ve done real scientific research and worked hard at it. Time to explain. Continue reading “Vaccine research – it doesn’t mean what the anti-vaxxers think it means”

How to prevent cancer in 12 easy steps – vaccines are critically important


I have railed against pseudoscientific charlatans who claim that they have an easy way to prevent or cure cancer. Generally, these snake oil salesmen try to convince you that they have some miraculous food, supplement, or spiritual energy, and on and on, that can either stop cancer in its tracks or keep it from ever growing in your body. Of course, none of their claims is actually supported by robust science. On the other hand, real science has 12 evidence-based methods to actually prevent cancer.

But what about those memes that say that supplements prevent cancer? Nope, they don’t. And that’s been shown in study after study after study after study (yeah, I could go on for a while).

What about avoiding GMO foods because they cause cancer? Again, studies show that GMO foods have no effect on cancers. Oh, one more thing – bananas don’t have tumor necrosis factor, and the yellow fruit can’t prevent or cure cancer (but that doesn’t mean that they aren’t delicious).

Despite the absolute lack of evidence that supplements, kale, bananas, or drinking the pure waters of a glacier-fed stream (which may not be an option with climate change) prevent cancer, there are only a few things that can be done to manage your overall risk of cancer.

How to prevent cancer has been codified by the World Health Organization’s (WHO) International Agency for Research on Cancer (IARC) into 12 steps (no, not that debunked one) that are called the European Code Against Cancer.

Let’s look at cancer and how to prevent cancer.

Continue reading “How to prevent cancer in 12 easy steps – vaccines are critically important”

Hierarchy of scientific evidence – keys to scientific skepticism


I am a scientific skeptic. That means I pursue published scientific evidence to support or refute a scientific or medical principle. I am not a cynic – cynicism is often conflated with skepticism. I don’t start with an opinion about these ideas. Scientific skepticism depends on the quality and quantity of evidence that supports a scientific idea, and examining the hierarchy of scientific evidence can be helpful in deciding what is good data and what is bad – what can be used to form a conclusion, and what is useless.

That’s how science is done. I use the hierarchy of scientific evidence to weigh the quality, along with the quantity, of evidence in reaching a conclusion. I am offended by those who push pseudoscience – they generally try to find evidence that supports their predetermined beliefs. That’s not science – it’s the opposite of good science.

Unfortunately, today’s world of instant news, with memes and 140-character analyses flying across social media, can be overwhelming. Sometimes we create an internal false balance, assuming that headlines (often written as clickbait) on one side are somehow equivalent to another side. So we think there’s a scientific debate when there isn’t one.

I attempt to write detailed, thoughtful, and nuanced articles about scientific ideas. I know they can be complex and long-winded, but I also know science is hard. Sorry about that, but if it were easy, everyone on the internet would be doing science. Unfortunately, there are too many people writing on the internet who think they are talking science but fail to differentiate between good and bad evidence.

But there is a way to make this easier. Not easy, just easier. This is my guide to an amateur (and, if I do a good job, professional) method of evaluating scientific research quality across the internet.

Continue reading “Hierarchy of scientific evidence – keys to scientific skepticism”

Science mistakes – debunking a trope loved by pseudoscience


I know I shouldn’t use the conspiracy theory fallacy when talking about the pseudoscience-pushing science deniers, who provide the bread-and-butter topics for skeptics. I keep observing the same ridiculous and insanely illogical arguments used in the same manner by all of the deniers, including the oft-repeated “science mistakes” trope. Honestly, I think the pseudoscience pushers meet annually in Sedona, Arizona, ground zero of woo, to discuss which trope they’re pushing this year.

The anti-vaccine zealots, creationists, anthropogenic global warming deniers, and whoever else pretends to use science to actually deny science frequently focus on this theme of “science mistakes.” And then they produce a list of cherry-picked examples that “prove” that science is wrong (see Note 1). Of course, this indicates more of a misunderstanding of what science is, and of the history of science, than it does a condemnation of science. But your typical science denier is probably not going to let facts get in the way of maintaining faith in their beliefs. So let’s deconstruct and discredit this “science mistakes” trope.

By the way, in my story, I admit that there are many “science mistakes,” so read on. Hopefully, it’s somewhat enlightening. Continue reading “Science mistakes – debunking a trope loved by pseudoscience”

Fake science about Star Trek accepted by predatory journals – anti-vaccine researchers happy


One of my pet peeves, of which there are many, is when a fake science paper is published by a low-ranked journal and trumpeted as if it were Nobel Prize-worthy research. You can read about anti-vaccine fake science published in these journals from notorious anti-vaccine “researchers” like Shaw and Tomljenovic, Exley, and Shoenfeld.

One of my pet loves is Star Trek – all versions, all the time. In fact, I occasionally have secret conversations with my fellow Big Pharma shills about Star Trek, in which vaccines are never mentioned. I am a self-confessed Star Trek nerd who has watched almost every episode from Star Trek: The Original Series through the current Star Trek: Discovery (see Note 1).

So when the opportunity falls into my lap to combine Star Trek and anti-vaccine nonsense, I am happier than a pregnant tribble. And when a fake science paper about the Star Trek universe gets accepted by low-ranked predatory journals – ones that are beloved by pseudoscience adherents across the world – it’s what I live for. Continue reading “Fake science about Star Trek accepted by predatory journals – anti-vaccine researchers happy”