The world loves coffee.
To say “people love coffee” is a little like saying “people love sex.” In fact, going by the straight numbers — two billion cups of coffee drunk per day vs. slightly fewer fornications — people may like coffee more than sex.
Muslims, Christians, Jews, Atheists, Europeans, Africans, Americans, Flat-earthers, scientists, terrorists, and Red Cross volunteers… Coffee-drinking cuts across all social divisions and unites us all in one big human family.
Homo sapiens caffeinophilus, you might call us.
Coffee is everywhere. The thought of caffeine-abstinence for the 90% of adults worldwide who partake daily is headache-inducing — both figuratively and often literally.
And so drastically reducing the world’s coffee supply — as unpopular as that would surely be — sounds less like reality than it does a bad plot device by a villain in a James Bond movie financed by Starbucks.
The coffee bean just doesn’t fit well with our preconceptions about endangered species.
But this grim prospect may be closer to reality than we’d like to think — and without a mustache-twirling mastermind as a convenient scapegoat. If we lose coffee, the perpetrator will be climate change.
And that means that ultimately, the villain will be us.
We all know the canary in a coal mine expression, right?
Just in case, here it goes: In the bad ol’ days, in deep underground mines, sometimes noxious gases would seep up and kill miners. Canaries breathe fast and are apparently more cardiovascularly fragile than your average coal miner. So in the days before high-tech early warning systems, a dead canary — assuming you’d had the presence of mind to enter your coal mine carrying a live one — was a helpful clue that you should get the hell out of there.
Canaries (and similar warning bells) work well in cases of acknowledged and immediate mortal peril.
But they don’t work so well in situations where the thing that kills you takes a while to do so. You’ve never noticed canaries smoking Marlboros or eating McDonald’s Extra Value Meals alongside consumers of those products, and there’s a good reason for this. It’s not that canaries wouldn’t eventually die from these things, given enough exposure; it’s that they wouldn’t die fast enough to merit the annoyances of long-term bird ownership.
Canary systems are great when the terror groundwork has already been laid. For nineteenth-century coal miners, the inciting fear wasn’t about the dead canaries; the birds were just a disposable commodity.
What made them care in the first place were graveyards full of dead coal miners.
It is said there are two ways people can learn not to do things: by burning their own hand on the stove, or by heeding the warnings of those who already have.
The latter method is considered preferable.
The problem is, these two options are only for learning by individuals. When it comes to whole societies, there seems to be only one of the options available… And it’s not the preferred one.
Societies learn by hands-on experience (pardon the pun). It’s not that history fails to provide national-level cautionary tales — very much the opposite. It’s just that regardless of relevant historical warning signs, nations and their leaders always seem able to cook up plausible-sounding dismissals that “things were different then/there” and sweep aside the warnings.
After all, history is endlessly debatable, and the lead-ups to most grand-scale disasters aren’t as clear-cut as hand on a hot stove.
Today, global climate change is probably the most looming, and least acknowledged, societal train wreck in progress. And true to form, humanity seems determined to burn its own damned hand on its own damned stove.
Environmental changes have spelled doom for human societies before — at limited, regional levels. (Jared Diamond’s 2005 book Collapse contains sufficiently chilling examples for those who want to ruin a few nights’ sleep.)
But these stories aren’t well-known enough, they aren’t scary enough, and they aren’t connected enough in the popular imagination to our species-wide bad habits. They’re not an immediate, whack-you-in-the-face threat like the dead bodies of your coal mining buddies.
And so we continue to reach toward the environmental stove with our own damned hand. (In fact, it’s a better analogy to say both hands. We’ve got exactly one planet; there is no back-up hand.)
Luckily, there is a middle-ground method of learning, which societies do seem capable of. It’s the inoculation model — akin to burning your fingertip, but recoiling in time to save your hand.
A fingertip-burn may spare the hand — but it had still better scare the bejeezus out of us.
Getting these historical fingertip-burns right is a tricky thing. Societies are fickle, diffuse in their attention, and easily distracted. Death and dismemberment are almost always part of the recipe necessary to catch a whole society’s interest to the point where its self-defense instincts are triggered.
Horrible as each of them was to the victims, I would cite both the bombings of Hiroshima and Nagasaki and the recent Ebola scare as successful historical inoculations.
With Hiroshima and Nagasaki, the world saw immediately just how serious these atomic weapons technologies were. The smoky silhouettes of incinerated civilians on still-standing walls were as motivating to strangers half a world away as dead coal miners had been to their small-town coworkers three-quarters of a century before.
And despite some close calls, this terrifying debut has kept our collective hand off the nuclear stove for over 70 years since.
With the recent Ebola virus outbreaks — though some have criticized the health care community’s response as “panic-mongering” — the fearful prospect of a widespread epidemic has marshaled public-health preparedness to a far higher level than it would otherwise be. “False alarms,” when it comes to biological pandemics, are something we should be cheering for, not complaining about.
(By contrast, the resources brought to bear against a distinctly regional threat like ISIS are wildly disproportionate to the risk. ISIS’s methods are designed to be newsworthy, but the organization is not a threat to people living in New Brunswick, or Johannesburg, or Detroit. For a pandemic virus, the opposite is true. And global climate change, in this sense, is a lot more like Ebola than it is like ISIS.)
The sinking of the Maine, a U.S. warship parked in the Havana harbor, sparked the Spanish-American War (Cuba, back then, was a Spanish colony) and led to an unmitigated American ass-kicking of the formerly first-rate global power that was Spain.
The Maine exploded and sank under debatable circumstances — it may have been an honest-to-goodness technical accident rather than foul play — but nevertheless, “Remember the Maine!” was trumpeted in newspaper headlines as a rallying cry for American patriotism.
It met the 1890s standards for a fingertip-burn that merited an immediate, dramatic response.
All the examples that I’ve given so far — Hiroshima, Ebola, the sinking of the Maine — had a human body count. Maybe that’s a prerequisite for societies to wake up and smell the coffee.
But maybe, just maybe, times have changed?
Could the tragic death of something non-human (but well-loved) cross the fingertip-burn threshold?
If you’re not a coffee snob, chances are good that you know one. And any self-respecting coffee snob will tell you there are two primary strains of commercial coffee: Arabica and Robusta.
If you’re drinking coffee as you read this, odds are good that it’s Arabica.
This also means that your cup of joe has been bred from a very narrow lineage, originating in the mountains of Ethiopia. These recent, confined origins mean that our commercial coffee has very little genetic diversity and is particularly vulnerable to climate change. Put simply — a disruption that kills one of ’em is likely to kill all of ’em.
Arabica plants grow best in a narrow temperature range between 18 and 22 degrees Celsius, and they require gentle, regular rainfall.
With global weather patterns destabilizing, Arabica is in jeopardy. Researchers predict that agricultural lands capable of supporting Arabica could fall by half in the coming decades. To make matters worse, short-sighted efforts to improve coffee yields from the diminishing acreage could speed soil depletion and further accelerate shortfalls in production.
As Arabica’s availability dips, will we switch to the bitter Robusta alternative? Will we grit our teeth and just keep paying more for the dwindling supply, until we long for the days of free refills and coffees that cost only $3.50?
Worse still, some climatologists predict that the world’s prime coffee-producing regions — places such as Vietnam, India, and Central America — will be among those hardest-hit by climate change.
Are coffee’s days numbered?
To some, all this may sound like panicked fear-mongering. After all, the world drinks three-quarters of a trillion cups of coffee annually. We couldn’t just run out…
It’s happened before.
Not with coffee, but with something almost as unthinkable.
I can hear you scoffing. “You’re not fooling me,” you say. “We’ve still got bananas.”
Well, yes and no.
The bananas we eat today are not, it turns out, our great-grandparents’ bananas. Bananas became globally popular not long before the aforementioned Spanish-American War. In fact, the worldwide banana craze spurred the initial development of refrigerated shipping. But I digress.
At the time, the world was hooked on a strain of banana known as the Gros Michel. (There are many, many strains of bananas — thousands of them. Most, you wouldn’t want anything to do with.)
But in the 1950s, a banana blight began ravaging the Gros Michel strain. It got worse, and worse, and, well… worse.
Devoted horticultural scientists, bullet-sweating banana plantation owners, and all the financial might of companies like Chiquita were unable to save the Gros Michel.
Season after season, the world watched helplessly as the beloved banana strain went extinct. We were powerless to stop it.
The stopgap was a second strain, the Cavendish. It’s the one we know today, simply, as “bananas.” The Cavendish is somewhat smaller than the Gros Michel. Cavendishes rot faster, bruise more easily, and — according to those who lived at a time when Gros Michels were available to taste-test and compare — taste worse. Gros Michels were sweeter, with a creamier texture.
They were, apparently by unanimous consent, a better banana.
But it was our modern Cavendish that was immune to the banana blight. And so, like the rat-ancestors who inherited the earth when the dinosaurs died out, by the early 1960s the Cavendish inherited the title of “banana” to a dispirited but option-less global consumer public.
The Gros Michel banana blight was bad. If you owned a banana plantation back then, it was downright disastrous.
But it was an isolated disease, affecting a solitary crop. An important crop, sure — but it wasn’t a domino poised to tumble an entire ecosystem or undercut global food production.
In short, it wasn’t worth freaking out about.
It didn’t qualify as a fingertip-burn.
“Never let a good crisis go to waste.” The words are often attributed to Winston Churchill, but the idea is as old as politics.
Bad news is a great motivator. A society amped up on adrenaline, fear, righteous indignation — is a society ready to get stuff done.
So imagine the crisis of a Coffee-less Future.
We are a world of addicts. (Let’s be honest, okay?)
And Climate Change — the red-handed culprit should this happen — is a villain who won’t stop at one crop, or one industry, or one continent…
The loss of coffee wouldn’t just be a disaster to breakfast, an existential threat to baristas, or a stock sell-off for SBUX…
It could be a fresh batch of dead coal miners.
A culinary mushroom cloud.
I like coffee. I drink it. I’ve even been known to photograph it on occasion. But if a few billion caffeine-withdrawal headaches could snap us all to attention on the inadvisability of playing chicken with the global environment…
I’ll be happy to switch to tea.
If we can keep our collective hand off the stove for the comparatively low price of an extinct beverage, we should count ourselves lucky.
Maybe a tomorrow without coffee is exactly the rude awakening we all need.
There’s something it’s hard not to notice when you speak with people about psychedelics.
Most pop-culture portrayals of psychedelics feature discussions that begin (and all too often, end) with the word “Duuuuuuuude.” This may originate with the character Shaggy from Scooby-Doo, the cultural progenitor of cartoon druggies. Something about Shaggy clearly struck a chord; he’s been spliced and cloned into dozens of equivalent, well-meaning imbeciles across all media ever since.
But with all due respect to mystery-solving dogs and their human sidekicks, when you talk with real users of psychedelics, the topic expands well beyond “Duuuuude.” People are eager to talk about their experiences. And it’s very rarely just “I was so fucked up” or “I partied like it was 1999.”
Well, sometimes it is that, but those things are the jumping-off point, not the follow-through.
People who become psychedelics aficionados — those who maintain an interest after their last rave or beyond their first bad trip — don’t want to talk about shiny colors or how their house cat suddenly turned telepathic. They want to talk about what their psychedelic experiences have taught them about themselves.
It’s a weaker punchline than “Duuuuude.”
And often a lot more confusing, long-winded, and deeply personal.
But as these are what real (i.e. non-cartoon) psychedelics users find comment-worthy about their experiences, it seems worth paying attention to.
What follows is my pet theory on why psychedelic experiences can be so transformative for people. But first, a question:
Why do people go to psychologists — or even to friends, family members, and others who know them well — to get advice on their own lives?
First, because — whether we’re narcissists or self-haters — we’re all deeply interested in ourselves. And it’s always fun to get other people to discuss this best-loved of topics with us.
And second, because we’re extremely biased when it comes to ourselves. We are not good judges of our own behavior, or recognizers of our own idiosyncrasies. We are the water we swim in — and we are thus both omnipresent and invisible in our own lives. With less freedom than Peter Pan’s shadow, we follow ourselves around 24 hours a day.
At some point, you’ve done the optical illusion where you stare at a high-contrast image for 30 seconds, then look at a white wall, and you can see the “burned-in” negative image of whatever you’d been previously looking at. (If you haven’t done this, were you never a kid?)
Your brain’s optical system — even in that short time span — had constructed a sort of overlay to “balance out” the strong contrasts in your visual field. This is similar to how a camera automatically controls for exposure, so overly-bright parts of an image don’t “white out” and lose detail. In your own optical system, this contrast-reduction helps sensitize you to variations in your visual field. “Variations,” in this case, meaning movement. (Maximizing awareness of nearby moving objects probably needs no justification. Think: predators, prey, pies-in-face, etc.)
So here’s the analogy: The nuances of our own behavior are the constant, unchanging elements in our own experiential world. From our point of view, that is.
That’s why someone else’s opinion — the friend, the psychologist, even the stranger who tells you that your fly is open — can be so incredibly valuable. We are a moving, high-contrast object in the perceptual experience of that person’s life… so we show up to them with greater clarity. We “pop off the background” in a way we can’t do for ourselves.
Just like they do for us. And just like they can’t do for themselves.
However, for all the upsides of getting an outside perspective, there is undeniably value in self-reflection as well. Although it’s seldom without effort, we can identify things about ourselves that others can never tell us, because we’ve got a huge advantage over them…
We have access to a far greater data-set about our own world, our own behavior, and our own experiences than any outside observer has. (At least, this was true in the era before smartphones. Nowadays, my iPhone may know more about me than I do — in a facts-and-figures sort of way.)
The conjunction of these two tool-sets — the memory library we store about ourselves, and the perspective offered by someone else, watching from the outside — is ripe with possibilities for new revelations about our behavior. What’s effective, what’s ineffective, what as-yet-untested strategies may prove to be effective, and why.
Psychedelics — in my theory — cut out the middleman.
During a psychedelic experience, the user’s view of reality is profoundly affected, like looking through a prism, or a kaleidoscope, or (to keep with the idea of an outside perspective) someone else’s eyeglasses.
And yet — here’s where the magic happens… Looking from that outside perspective, the psychedelics user gets to page through his or her whole catalog of self-knowledge. The smorgasbord of memories and details even close friends don’t know — whether because these things are too private to admit, or too mundane to ever come up in conversation.
This fertile blend of outside perspective plus inner knowledge is the essential recipe for the insights that psychedelics can sometimes provide. Of course, a psychedelically-skewed perspective could also be so confusing as to be useless. Your pet goldfish will see you with a perspective that’s even more alien than your psychologist’s — but your goldfish’s perspective is less likely to be instructive when optimizing your behavior.
“Know thyself” is a quote attributed to Socrates — and to nine or ten other long-dead thinkers. Self-interested as humans are, there’s no reason to think that just one person, or one culture, came up with this idea. That’s probably what makes it such good advice.
Psychedelics are one means for people to know themselves better. Maybe not the best means, certainly not the only means, and for some people, not even a safe means. My comparison of psychedelic insight to psychological counseling is no more or less serious than my comparison of psychedelic insight to talking with a goldfish.
To put that another way: some psychedelic “insights” might not make it past the Duuuuuude threshold. But the same will be true of talking with a psychologist, or of any other path to self-knowledge.
If knowing one’s self were easy, everyone would be doing it.
Psychedelics aren’t easy.
And neither are they a direct path to meaningful insight, any more than the discovery of fire was a direct path to the steam engine. My point isn’t to evangelize, or even to recommend – it’s just to propose a mechanism behind the age-old, cross-cultural claims of the value of psychedelic “visionary” experiences.
Of course, there are also probably epochs-old, cross-cultural versions of people saying Duuuuuuude and their friends laughing at them.
But ultimately, the punchline is the process.
It’s the mental discombobulation of psychedelic states that gives them their utility.
Somewhere on the biochemical middle-ground between sobriety and being “completely fucked up,” a psychedelics user may just find himself on an optimal cognitive plateau, offering an unexpected view toward self-discovery.