November 21, 2014 · Fringe, Podcast

Episode 53

In a slight departure from regular Smart Drug Smarts convention, in Episode #53 we take a look at “snake oil” – the term used for fraudulent medicine, or products that make unverified claims.

Jesse speaks to Drew Birnie, Research Scientist and Content Manager at MarkManson.net, a website dedicated to providing – in the words of its founder – “self-help for people who hate self-help.” Drew’s job is to comb through research material in a scientific and data-driven way, in order to separate empirical evidence from pseudoscience. He explains confirmation and survival biases, takes us through his research methods, and highlights for us the value of being proven wrong.

Episode Highlights

0:32 – What is "snake oil"?
2:33 – This Week in Neuroscience: Neuroscientists and the Computer
4:47 – Thank-you to Catrine Bergeron
5:26 – Big Data in Biomedicine Conference
5:54 – Come say "hi!" on Twitter
6:25 – Mark Manson – "Self-help for people who hate self-help"
7:34 – Introduction to Drew Birnie
9:29 – Challenges and audience expectations
10:15 – Confirmation bias: a cognitive pitfall that snake oil salesmen take advantage of
11:19 – Survival bias: selectively publishing results
13:19 – Drew's claim assessment process
14:49 – The power of the placebo effect, and double-blind testing
16:41 – Jesse's personal experience with piracetam, and accounting for individual variation
17:52 – Adopting an empirical method: "Letting go of always having to be right, and realizing the value of being proven wrong"
19:03 – Accepting falling into the snake oil trap as a trade-off for the temporary spike of hope
20:00 – Snake Oil, Part 2: our first-ever video episode teaser
20:47 – Ruthless Listener-Retention Gimmick: Even Casual Marijuana Smokers at Risk for Structural Brain Changes
23:58 – Thanksgiving-themed episode teaser

Key Terms Mentioned

Links From Drew

Episode Transcript

**Voice-over:** *I try to imagine a fellow smarter than myself, then I try to think - What would he do?*

**Announcer:** *Charge up your axons, ready your receptors and shift your lobes into upper beta phase. You're listening to Smart Drug Smarts - the podcast dedicated to helping you optimize your brain with the latest breakthroughs in neuroscience, nootropics and psychopharmacology.*

**Jesse:** Hello and welcome to Smart Drug Smarts. I'm your host, Jesse Lawler, excited to bring you the 53rd episode of this podcast, dedicated to the improvement of your own brain by any and all means at your disposal. This week we're going to do a little bit of a throwback to a movie that you may remember, featuring Samuel L. Jackson and a cast of snakes. "I've had it with these motherfucking snakes on this motherfucking brain." Well, Samuel L. Jackson will not actually appear in this episode, but I thought that might be a great way of introducing the topic of snake oil, which is something I have wanted to cover for a while. Snake oil, for those of you who might not be familiar with the term, is the derogatory term we give to what people are selling when their medicine is in fact just quackery and nonsense - playing on people's hopes to get them to spend money on something without any scientific credibility behind it. This is a long-standing tradition of the bad kind in the world of personal health. The classic vision of the snake oil salesman is a huckster travelling the Wild West with a horse-drawn carriage full of snake oil, which is promised to cure all ills and be a complete panacea. Of course, he sells as much as he possibly can to the profoundly naive citizenry of the town, then tries to hoof it out of town to the next spot, to take advantage of the next people before the people he just sold a bunch of bogus merchandise to realize he was up to no good and chase him around with torches and pitchforks.
Of course, the modern-day version would not be a guy selling snake oil at all, but more like a shady online retailer making lots of claims about a product based on either bad science or a complete lack of science - playing on people's hopes and getting them to surrender their credit card information for something which probably has no medical value whatsoever. We're going to be talking to a guy whose job is essentially to investigate claims for their scientific merit - not all of them personal health - a professional researcher whose job is to sort the wheat from the chaff and decide what is worth giving actual, authentic weight to. That is what he does professionally, and he is going to talk to us about how he goes about that process. If you hang around until the end of the episode, I am going to tell you what is going on in our longstanding quest to have a Smart Drug Smarts marijuana episode, plus some other news about marijuana. But, before we get into any of that, let's do This Week in Neuroscience.

**Voice-over:** *Smart Drug Smarts - This Week in Neuroscience!*

**Jesse:** A couple of weeks ago - in fact, it was published on Halloween of this year - there was an opinion piece in Frontiers in Neuroscience written by Joshua W. Brown, from the Psychological and Brain Sciences Department of Indiana University, and I thought this was a super-interesting read. The article's title is "The Tale of the Neuroscientist and the Computer: Why Mechanistic Theory Matters" and Dr.
Brown is making the point that despite the fact that we're obviously in the Golden Age of brain data capture - with reams and reams more information coming in, in extremely fine-grained detail, about what is going on at a cellular and sub-cellular level within the brain, well beyond what we've been able to capture previously - we're still lacking an overall mechanistic model of how the brain actually works. While we've got a heck of a lot of tree-level data, we might be missing the forest. The article is essentially an analogy, which I won't try to do full justice to here, between our current attempts to understand the brain and how scientists might try to figure out how a computer works if they lived in a world that had a working computer but no computer engineers and no diagrams of how to make a computer, and they had to reverse-engineer it. He says, "Within the fields of systems, cognitive, and behavioral neuroscience in particular, I fear we are in danger of losing the meaning of the question 'How does it work?' As the saying goes, if you have a hammer, everything starts to look like a nail. Many researchers are especially well-trained in psychology, and so the research questions focus predominantly on understanding which brain regions carry out which psychological or cognitive functions, following the established paradigms of psychological research. This has resulted in the question often being re-framed as 'What brain regions are active during what psychological processes,' instead of 'What mechanisms are necessary to reproduce the essential cognitive functions and activity patterns in the system.'" Obviously, it is a pretty heady article. This isn't the kind of light, fluffy reading they publish in Newsweek or something like that, but I found it really interesting.
It definitely sheds some light on what could be an endemic problem in our approaches to understanding the brain, as well as offering some ideas on how we can alleviate those problems by training neuroscientists to be very aware of systems theory and modelling - areas in which they are sometimes not schooled.

**Voice-over:** *Smart Drug Smarts.*

**Jesse:** By the way, I would like to circle back to an earlier This Week in Neuroscience segment we did about Dr. Ellen Langer's experiments. That was based on an email I got from listener Catrine Bergeron, whom I should have thanked on that episode but whose email I kind of lost in the shuffle - so I wanted to officially extend a big thank you. I should also mention that we've got a Suggestion Box page on www.smartdrugsmarts.com, so if you come across any neuroscience-related articles or other things you think would be cool for us to cover on the show - episode recommendations, stuff like that - please send them our way. It turns out that there is a heck of a lot of neuroscience happening all the time, and having the eyes and ears of the listenership helping us know what is going on is certainly a big help. I'd also like to give a hat-tip to my friend Avi Roy over at Oxford, who is putting together the Big Data in Biomedicine conference. They are working there on what Roy calls "true preventative medicine" and trying to fight the diseases of old age. I guess you are never too young to be fascinated by the idea of extreme longevity - Roy is not that old of a guy, but he's already pretty much stapled his life's work to doing everything he can to push forward longevity and sustained health throughout a person's entire lifespan. For those of you who are Twitter people, if you are interested in following Roy, he is @agingroy. I should probably talk more about Twitter here, because I am becoming a pretty big Twitter geek myself.
I am not on it quite as religiously as some people are, but my personal Twitter handle, for those of you who are interested, is @lawlerpalooza. Of course, we've also got a Smart Drug Smarts Twitter handle, which is @smartdrugsmarts.

**Jesse:** So, as mentioned, snake oil is the topic du jour on this episode. I've been wanting to do a snake oil episode for a while but haven't really had the right way to approach it. Then, about a month and a half ago, something happened. I am on an online forum with a guy who I've been lucky enough to get to know over the course of the past year or two. His name is Mark Manson. Mark is a professional author, blogger, entrepreneur - he has a very popular website, www.markmanson.net, where he shares his views on almost anything and everything. He describes what he does as "self-help for people who hate self-help." He mentioned that he has a researcher who digs into the scientific credibility of different things because, obviously, in the self-help space there are all sorts of claims like "Doing X will revolutionize your life." He is very careful about fact-checking what he puts out there, and he wants to know that anything he purports to be true is as closely vetted as it possibly can be. So, when it came to my attention that I was only one person removed from a professional scientific credibility checker, I dropped an email to Mark and I was like, "Hey! Can I talk to your guy?" And he was like, "Yeah. Here's my guy's email." The guy is Drew Birnie, who we are about to talk to now.

**Voice-over:** *Smart Drug Smarts.*

**Jesse:** With no further ado - if you could give me a little bit of your background, and how it is that you have become an expert on this sort of research?

**Drew Birnie:** Yeah, sure. Well, I'll tell you where I am right now: I'm working for Mark Manson. I came to work for him as his content manager, I guess you could say.
I help him out with research - we do a lot of research on the articles he writes, and he really wants to take a data-driven and scientific approach to it. So having somebody on hand to help him out with that had been kind of a priority of his for a little while, and I got that job back in May. Before that, I got my Masters at the University of Nebraska in neuroscience and behavior. After that I was a research scientist - a staff scientist, I guess - at the University of Nebraska. I did all sorts of research: research on humans, a lot of research on monkeys, rodent studies. I taught a little bit there too - I taught research methods to university students. So there was a lot of evaluating the good and the bad science, kind of separating the wheat from the chaff, if you will. And teaching that to university students really gave me a firm handle on the problems people have when they're evaluating research or evaluating claims that are made. After that I went to the University of Iowa and worked toward my PhD for a little while, and that's when this opportunity with Mark came up. So my PhD is on hiatus right now while I work for Mark and have a lot of fun hanging out in the mountains, which is a lot of fun too.

**Jesse:** So a day in the life for you now is really internet-research based?

**Drew Birnie:** Yeah, that's right. Like I said, Mark really wants to take this kind of scientific, data-driven approach to self-development. That's a lot of work, to make sure that what you're saying isn't completely bogus. So in a day in my life, I read a lot of research. That's the main part of my job. That's kind of my background - what I do and how I've come about in the science world, I guess.

**Jesse:** Everybody who is reading about a self-improvement thing wants to think: this applies to me, this can change my life, this is going to help me.
You've sort of got an audience of people who are automatically in a very hopeful mindset, which is exactly the kind of audience that might play into the hands of somebody selling them something dubious.

**Drew Birnie:** Absolutely, absolutely. Like I said, if you take a data-driven approach to that, there's all sorts of nuance - all sorts of challenges that we face every day. People come in with these high expectations, and I think part of what we do is try to make people look at things realistically. A big marketing tactic of snake oil salesmen is to over-promise and get those people's hopes up - and unfortunately they often under-deliver.

**Jesse:** What are some of the cognitive pitfalls that we all have as humans that the snake oil salesmen are taking advantage of?

**Drew Birnie:** First of all, like you just mentioned, with the hopeful mindset we bring to self-improvement comes the confirmation bias. Simply put, it's when we only seek out and/or validate evidence that confirms our pre-existing beliefs, while we simultaneously disregard evidence that contradicts them. For example, say you have a nootropic of some sort that you are really excited about, and you tell somebody about it and they're kind of skeptical. What you might do is go find a whole bunch of research that supports your view - and ignore a whole bunch of research that contradicts it. Snake oil salesmen will take advantage of this: they know that you're out there looking only for support, whether it's for an improvement in your cognitive performance or something like that. They know you're hopeful, and they want you to sink into that, I guess.

**Jesse:** So we talked a little about confirmation bias there - I think that was the first one.
And then let's also talk about survival bias, because that's a really interesting one that I think is starting to enter the public consciousness, but a lot of people maybe haven't heard of it yet.

**Drew Birnie:** Essentially what it is - especially the way that scientific and academic research is set up right now - is that you're more likely to publish results that have shown significant findings. And when I say significant, I mean statistically significant findings, or even just interesting findings. Journals are more reluctant to publish results that show no effect. A classic example would be something like a nootropic, or even just a clinical drug trial, in which they give a drug and they don't find anything in, say, 5 trials, but there are 3 trials out there where they did find something.

**Jesse:** It seems like in an academic setting it should be sort of a free-for-all with information that's found in one university vs. another. I can see that for privately funded research by industry, they might want to protect their own data sets, but it seems weird that in an academic setting there's still not more open availability.

**Drew Birnie:** Again, there are ideals and there is practice. Scientists - even academics at universities - are human too, and they're going to be protective of what they see as their work. To some degree it is, in a way, their intellectual property, I guess. But at the same time, open communication is in the spirit of academia. There are glimmers - there's kind of an open-source network - there's a website that's looking at both psychological research and cancer research, and they've made an attempt to go back and replicate a lot of the studies that have come out in the past 10 to 15 years, as well as make all of that data available for anybody else to evaluate, which is a fantastic idea. They give out some small grants as well.
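(An illustrative aside, not from the episode: the selective-publication dynamic Drew describes - null results staying in the drawer while striking ones get published - can be sketched in a few lines of Python. All numbers here are arbitrary assumptions chosen for demonstration.)

```python
import random
import statistics

random.seed(42)

def run_trial(true_effect=0.0, n=30):
    """Simulate one placebo-controlled trial of a drug with a given true
    effect, returning the observed treatment-minus-control mean difference."""
    treatment = [random.gauss(true_effect, 1.0) for _ in range(n)]
    control = [random.gauss(0.0, 1.0) for _ in range(n)]
    return statistics.mean(treatment) - statistics.mean(control)

# Run many trials of a drug that does nothing (true effect = 0).
results = [run_trial() for _ in range(1000)]

# "Survival bias": only trials with a strikingly positive difference
# make it into the published literature.
published = [r for r in results if r > 0.5]

print(f"Mean effect across ALL trials:   {statistics.mean(results):+.3f}")
print(f"Mean effect in PUBLISHED trials: {statistics.mean(published):+.3f}")
print(f"Fraction of trials published:    {len(published) / len(results):.1%}")
```

Across all trials the measured effect averages out to roughly zero, but the "published" subset shows a substantial apparent effect - which is exactly the distortion a reader who only sees the published literature would be left with.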
So there is this kind of burgeoning push towards a little bit more of what I would call a fair system.

**Jesse:** Let's go back to how marketers take advantage of some of these holes in the data-collection world. When you're assessing a claim - written, published on the web, whatever it is - what's your process like?

**Drew Birnie:** The first thing I do is go straight to the methods section of a scientific or research publication. This is usually the last thing people read, or they don't read it at all, which is a fatal mistake. I want to know exactly what they did before I spend any more time learning about what they think is their insight into the theory of whatever they're studying. In my book - the book of empirical thought - methods always trump results. The other thing I do, after I've figured out exactly what they did and have a good understanding of it, is look at their results and make my own assessments. I come to my own conclusions, my own interpretation, before I read what they say, because sometimes I can come up with a different interpretation from the same set of data. And what's interesting, when you do have these discrepancies, is to see what the reasoning is behind them. Often you'll uncover a different line of logic that they used, with which you can further pick apart their arguments - or pick apart your own.

**Jesse:** How frequent is that - that you actually come out with a different analysis of the same data?

**Drew Birnie:** It's usually a combination of their results and methods that makes me think, "I don't know if I agree with that, because of the way they collected this data or this result."

**Jesse:** Obviously a lot of this is going to touch on the placebo effect. The placebo effect seems like it's finally getting the respect it's due. I'm seeing more and more articles about just how powerful it is.
I did a placebo episode maybe about 6 months ago, and the thing that I've been thinking about placebo-wise ever since is that the placebo effect happens even when there are actual results as well. It's like another layer on top of things that really can happen. If you take a pill and feel a result, then your expectation of feeling that result - even though the pill might actually be doing something - turns up the dial a little bit further. The fact that there can be placebo effects in addition to actual results was kind of a mind-bender for me at first.

**Drew Birnie:** Oh yeah. The placebo effect goes back to methodology, especially in human research. This is why the randomized, double-blind, placebo-controlled experiment is the gold standard. You randomize your subjects so that they have an equal chance of being in either the treatment group or the control group. That way, if they're randomly assigned, you get groups that are very similar to each other - you're not favoring one type of person in one group or another. Then the double-blind, in theory, will take care of that placebo effect. When I'm evaluating research - not just on nootropics, but any kind of research where there is an intervention - I ask: is there that double-blind aspect to it? The person receiving the treatment or the experimental manipulation - are they aware of which treatment they're receiving? Do they know if they're in the control group or not? And maybe even more importantly, does the investigator know which group subjects have been assigned to? Because they can bias the results that way. The more one is aware of the effect that a drug might have, the more susceptible one is to the placebo effect. So that's a big red flag if you don't see that.
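(Another illustrative aside: the randomized, double-blind design Drew outlines can be sketched in code - random assignment plus opaque coded labels, so neither subjects nor investigators know who is in which arm until the code is broken after data collection. The function and label names here are hypothetical, for demonstration only.)

```python
import random

random.seed(7)

def randomize_double_blind(subject_ids):
    """Randomly assign subjects to treatment or placebo, then hide the
    assignment behind opaque codes so neither subject nor investigator
    can tell who got what until the key is revealed after the trial."""
    ids = list(subject_ids)
    random.shuffle(ids)                      # random assignment
    half = len(ids) // 2
    assignment = {sid: "treatment" for sid in ids[:half]}
    assignment.update({sid: "placebo" for sid in ids[half:]})

    # Everyone running the trial only ever sees codes like "A" / "B".
    code = {"treatment": "A", "placebo": "B"}
    blinded = {sid: code[arm] for sid, arm in assignment.items()}

    # The key linking codes back to arms is held aside by a third party.
    key = {label: arm for arm, label in code.items()}
    return blinded, key

blinded, key = randomize_double_blind(range(20))
print(blinded)   # investigators see only coded labels during the trial
print(key)       # revealed only after data collection is complete
```

The design choice worth noticing is that bias is removed structurally, not by trusting anyone's good intentions: as long as the key stays sealed, neither expectation effects in subjects nor unconscious cues from investigators can track the true assignment.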
**Jesse:** Let me ask this, though, because this is something that goes against my intuition about the placebo effect: my own personal experience with piracetam. Piracetam is a drug I had great expectations for, based both on things I had read about other people's subjective experiences and on the fact that there is study after study showing it to have neuroprotective and, in some cases, cognitive-enhancement benefits that one can feel. So I had nothing but high expectations going in, and I never could feel it. That doesn't mean it wasn't doing anything inside my brain, but there was no subjective effect whatsoever - and if anything, I would've been completely primed to have a strong positive placebo response, which I didn't have at all. What's going on in a case like that?

**Drew Birnie:** Here you're bringing up a study of one. It could be anything. This is why we do trials on groups of people: there is so much variation, and there's variation in the placebo effect too. Some people are more susceptible to the placebo effect than others. Maybe sitting there thinking, "Oh, this effect will be strong - I really do believe in this," in some way ended up letting you down by setting your expectations too high. There is one other point I wanted to make about evaluating research and the mindset you go into it with. As empiricists - as people who value empirical thought - we at least pay lip-service to the value of falsification. Part of adopting this empirical mentality is letting go of always having to be right, and realizing the value of being proven wrong. We can gain so much more information and value from an experiment - or even just an experience in everyday life - by being proven wrong than we can by continually confirming what we believe.
To put it this way: think about the last time you realized that you were wrong about someone, or something, or even about yourself, and the profound effect that had on you. When you're wrong, you have this perspective shift, right? You have a new frame of mind; you learn so much more about yourself, or about this other person, or about this situation, than you do when you go through life continually confirming things you already believe. So this goes back to the confirmation bias as well.

**Jesse:** I think it's just the nature of our species that there's always going to be snake oil. There are always people who probably, deep in their hearts, kind of recognize that they are accepting bullshit, but they want the temporary spike of hope, and they're willing to pay money for that little jolt of hopefulness, even if the logical part of their brain knows it's bullshit. You give people what you can - you give them the tools to do the critical thinking - and what they choose to do with it at that point is sort of on them.

**Drew Birnie:** Yeah, and I think that's where we are right now - in that phase of teaching people to think critically. Because again, like I said, not everybody has the time to evaluate every single little claim that comes out, everything that they put in their body; they don't have time to spend weeks researching. But there are things they can do to think critically and make a general assessment right away, and at this point that's the best we can do.

**Voice-over:** *Smart Drug Smarts.*

**Jesse:** So, thank you very much to Drew for joining us for that interview. It was a little bit short, but it's actually intended to be the first of a two-part interview series.
We are thinking about doing the first-ever video episode of Smart Drug Smarts, actually following Drew as he does his online fact-checking on a real website - we'll find something for him to investigate and follow him screen-by-screen as he goes through his process. We thought that would be really interesting, but there are some technical considerations that, frankly, we were outmatched by on our initial phone call. So we decided to do a regular audio interview, which you just heard. Probably sometime early in the new year, we'll do a video tutorial of what Drew's fact-checking process looks like. But now, the Ruthless Listener-Retention Gimmick.

**Voice-over:** *Smart Drug Smarts - Ruthless Listener-Retention Gimmick!!!*

**Jesse:** So, I got an email from a friend a number of months ago asking me to do a marijuana episode, and in his email he made a very compelling case for the use of marijuana for creativity enhancement. It is, obviously, not a smart drug - I don't think anybody is going to make that argument - but he did make a pretty compelling argument for its usefulness for something other than pure recreation or pain relief, which are the two things it commonly gets used for. So, ever since then (this friend will remain nameless for obvious reasons, but he knows who he is, and I appreciate it, buddy), I've been trying to put together a Smart Drug Smarts marijuana episode. But a funny thing happened on the way to the marijuana doctor.
We've had a scheduled interview with a particular scientist studying marijuana that has now been cancelled 3 or 4 times. For those of you with any familiarity with dealing with pot people, having things get cancelled and the timing never quite working out is kind of the continuing story of the world of marijuana, and I just think it is so damned funny that even trying to put together a scientific interview on the subject seems to be getting thwarted by very pot-head-like problems. Let me put a very, very large asterisk on that: when we finally do have a Smart Drug Smarts marijuana episode, the doctor I finally get to talk with may or may not be the same doctor I've been trying to reach so far. So let this not be pre-disparagement of a future guest - I wouldn't want anyone to take it the wrong way - but I did think it was kind of funny, and maybe you will as well. For this Ruthless Listener-Retention Gimmick, though, I wanted to bring your attention to an article published in April of this past year, which cites a study published in the Journal of Neuroscience showing that even casual marijuana smokers are at risk for structural brain changes. It has long been pretty well understood that heavy marijuana users have structural changes in their brains - changes you would not necessarily want - as a result of heavy pot use, but apparently this is a sliding scale, where even a little bit is doing something, and it scales up from there. Researchers from Northwestern University, Harvard Medical School and Massachusetts General Hospital used MRI scans to compare the brains of 20 people aged 18 to 25 who smoked at least once a week to those of 20 individuals with little to no history of marijuana use. Though the low-to-moderate users were not addicted to the drug, imaging data showed that their brain anatomy had nonetheless changed. Said Anne Blood, Ph.D.
(great name for a doctor, by the way), "While I don't think anyone has directly contrasted recreational with dependent users, it is pretty clear from our data that the more you use, the more the brain is impacted." The nucleus accumbens, a part of the brain linked to reward processing, was larger and had changed shape in casual smokers compared to non-users. The researchers also found that the amygdala, which helps to regulate emotions, had changed in shape and density in those who smoked the drug. "These are not brain regions that you want to alter," the doctor said. "There is no doubt that we will find these structural changes have some impact on these individuals' neurological or psychiatric and behavioral function." According to the National Survey on Drug Use and Health (NSDUH), 18.9 million Americans say they've recently used marijuana.

**Voice-over:** *Smart Drug Smarts - The podcast so smart, we have smart in our title, twice!!*

**Jesse:** You heard it. That is the episode. If you liked what you heard, please recommend this podcast to your friends and/or leave us a review on iTunes. I'd like to give a little back-slapping round of approval to the Smart Drug Smarts team for getting this episode out less than a week after the publication of our last episode - and I think we are actually going to repeat that. It's a few days before Thanksgiving at this point, and we've got a Thanksgiving-themed episode coming up for Episode 54. We're going to try to have that out by the Tuesday of Thanksgiving week, so you can brace yourself before sitting down at the Thanksgiving dinner table - or afternoon lunch table. I don't know what Thanksgiving's deal is, why everyone's always calling it dinner and eating it at 1 o'clock in the afternoon. It's just never made any sense to me. But whatever you and your family do, we'll be there for you here at Smart Drug Smarts.
The show notes for this episode will be online at [www.smartdrugsmarts.com](http://www.smartdrugsmarts.com/), including links to everything we talked about here. I will be back at you, as mentioned, earlier than usual, on this very same podcast, with the same unflagging commitment to helping you fine-tune the performance of your own brain. Have a great week and stay smart.

**Announcer:** *You've been listening to the Smart Drug Smarts podcast. Visit us online at [www.smartdrugsmarts.com](http://www.smartdrugsmarts.com/) and subscribe to our mailing list to keep your neurons buzzing with the latest in brain optimization.*

**Disclaimer:** *Smart Drug Smarts should be listened to for entertainment purposes only. Although some guests on the show are medical doctors, most are not, and the host is just some random guy. Nothing you hear on the podcast or read on [www.smartdrugsmarts.com](http://www.smartdrugsmarts.com/) should be considered medical advice. Consult your doctor, and use some damn common sense before doing anything that you think might have a lasting impact on your brain.*
Written by Rhiannan Roe
Rhiannan Roe is a writer, editor and unapologetic champion of self-improvement. Combining her passions has led to her helping several start-ups across three continents. In her spare time she travels, collects stories from inspiring people, and fruitlessly endeavors to read every book ever written.
Affiliate Disclosure
This website contains affiliate links, which means Jesse may receive a percentage of any product or service you purchase using the links in the articles or ads.  You will pay the same price for all products and services, and your purchase helps support Smart Drug Smarts' ongoing publications.  Thanks for your support!