We all make poor choices against our better judgment. It’s kind of what makes us human – the tendency to actively and willfully make decisions that will result in unfavorable outcomes. Sure, the candy bar tastes good, but you know you’ll feel awful after eating it. Yeah, that blog is fun to read, but you know you’d be much happier if you finished that essay for class first. And yet five minutes later, a candy bar wrapper sits, emptied of its contents; your molars house fragments of nougat and sport a caramel sheen; light nausea approaches; and you find yourself wading knee deep through comment sections, MS Word window minimized. What just happened? Why did you do those things that you told yourself you wouldn’t, that you warned yourself against, and whose negative ramifications are already coming to fruition – just as you predicted?
Last week, we began the dialog with my introductory post on akrasia – the act of knowingly working against one’s own interests – but we didn’t get into any details. Today, I’m going to try to provide a few answers. I’m going to delve into the reasons for akrasia, particularly as it pertains to making bad eating choices. I won’t discuss psychological issues, per se, instead focusing on physiological explanations, but keep in mind that the two are often one and the same. You can’t really separate the mind from the body (well, without killing the person, that is).
Whether we pick up the phone to order takeout, open the candy wrapper, shove the spoon into the jar of Nutella, or accept the offered slice of cake, we are making a decision. Most health experts say making the healthy decision is a matter of willpower – that if you make an unhealthy decision, you simply didn’t want it badly enough. Like Bob Newhart in that old MADtv sketch, they seem to think all you have to do is just “STOP IT!”
Well, it’s not that easy. Otherwise, folks wouldn’t be making these decisions that go against their better judgment. Otherwise, they’d indeed be “stopping it.”
So why do we do it?
Many – perhaps most – poor dietary choices stem from an inability to resist cravings. And who can blame you, really? Whether they’re for chips, sweets, or something specific like wheat, cravings are difficult to ignore by design. Their very purpose is to get you to give in to them, to override your rational side and promote decisive, single-minded pursuit of whatever it is you crave. Something, then, is at the heart of these cravings. Something physiological. But what?
There’s often a disconnect between what our animal bodies need or desire and what our human minds know is best. When the animal body perceives a deficiency, some nutrient lacking in the diet, like salt, it often develops a craving for that nutrient. 20,000 years ago, if you were salt-deficient you would have gone looking for shellfish or rock salt, because those are the salt sources you knew. Your food memory bank was rather limited in scope. Today, that same salt deficiency might manifest as a craving for Pringles or Cheezits, because those foods are listed under “salt” in your food memory bank.
Let’s look at some research on the subject. In one study (PDF), human volunteers were put on a strict low-sodium diet and treated with diuretics for ten days, rendering “substantial sodium depletion.” The effects were pretty telling. Salt thresholds – the minimum detectable level of sodium chloride dissolved in water – lowered dramatically; the subjects could detect lower levels of salt while sodium-depleted than they could once sodium was repleted. Furthermore, salt depletion made salty foods taste better than they had before the study, and salt-depleted subjects rated the saltiest foods as the most attractive and desirable.
It’s quite possible that your “Pringles cravings” are actually salt cravings, and that the former is simply what your animal body associates with “salty.”
What about sweet cravings? Paul Jaminet thinks that sugar cravings might actually be fatty meat cravings. It sounds crazy on the face of it, but he makes some salient points. First, certain amino acids are actually slightly sweet. These sweeter amino acids are also hydrophobic, which means they are found inside cells with fats, and they repel water (fat doesn’t mix with water). Hydrophilic amino acids, by contrast, are water-soluble, do not associate with fat, and trigger the umami taste buds rather than the sweet ones. A leading theory of sweetness even suggests that in order for a compound to be sweet (to interact with sweetness receptors), it must be hydrophobic. Paul suggests that in a Paleolithic environment with ample prey, bland (rather than sweet) tubers, and less abundant, seasonal fruits, cravings for sweets drove us to eat calorie-dense, nutrient-rich fatty meat.
It’s possible, yet again, that our animal bodies are confused by the modern (and totally understandable) conflation of sweet with sugar and misinterpret what is actually a need for fat. Perhaps those sweet cravings turn into sugar binges because sugar isn’t actually what your body wants.
Wheat contains opioid peptides that may be able to activate opioid receptors in our bodies. You know what else activates opioid receptors? Opium, morphine, and heroin. (I’ve never tried any of them, but I hear they can inspire some real devotion from their users. See: Trainspotting, Requiem for a Dream.) I know that may sound glib, and I’ll be the first to admit that research into this is still very preliminary. You won’t find any ironclad evidence on PubMed that wheat is addictive. But the thinking goes that rather than hitting you like a ton of bricks and rendering you speechless from the sublime triggering of your opioid receptors, wheat addiction manifests as a stubborn, lingering thing.
Evidence does exist, however limited. One older paper (PDF) identifies multiple opioid peptides in wheat gluten, suggests that they are capable of binding to brain opioid receptors via a “plausible biochemical mechanism,” and deems them of “physiological significance.” Dr. Emily Deans, of Evolutionary Psychiatry, has actually used naltrexone – a drug that blocks opiate receptors – to curb wheat cravings in celiac patients who are trying to kick the “habit.”
Wheat plays a huge role in the diets of industrialized nations. If you’re reading this, you probably grew up eating it. You may still be eating it from time to time – and that may be at least partly responsible for your urge to eat that slice of bread.
Similarly to wheat, sugar has addictive properties. A review of the rat studies shows that rodents will become quite addicted to sugar rather quickly, at times even choosing it over pharmaceutical-grade cocaine. There’s evidence that the addictive properties affect humans, too. As with wheat, naltrexone has been shown to reduce the rewarding properties of sugar in people. When you block the opiate receptors in the brain, sugar simply isn’t as rewarding and you’re not driven to consume as much of it.
Sugar appears to be addictive in both rats and humans. You, being a human, could very well be drawn to make bad decisions about sweets because you are addicted to them.
Everyone knows about “stress eating.” Chronic stress is repeatedly linked to obesity and overeating, and there’s strong evidence that it even elicits cravings for specific foods or nutrients. Like sugar. Remember our old friend cortisol? It’s one of the premier stress hormones, and in high cortisol responders – people who secrete lots of cortisol in response to stress – cravings for and intake of sweets increase dramatically. Stress also appears to increase the desire for “comfort foods,” those deadly high-sugar, high-fat concoctions, via an increase in ghrelin, a hunger hormone.
Stress can also lead to salt cravings, probably because the adrenal glands, which produce stress hormones, also produce hormones that regulate electrolyte balance. And indeed, stress can also increase salt requirements, which, as we know from earlier, can often manifest as “chips cravings” or “cracker cravings.”
My general rule is that starchy vegetables like potatoes and other tubers, as well as sweet fruits, are elective foods. You don’t need ’em, and most people, especially those who are trying to lose weight, will be better off limiting them. They can be tasty, though, and if your activity levels warrant a higher intake of carbs, you could eat them. I have no problem with that and I don’t see them as problematic in that situation. In fact, if you’re doing daily CrossFit WODs or pounding the pavement to the tune of 100+ miles each week, you had better eat some tubers and some fruit. If you don’t, if you go very low carb while trying to maintain that breakneck pace, you will suffer. You will probably also crave easily digestible, refined, processed junk carbs. Think chips, bread, pizza, pasta, or – my own personal favorite/nemesis from my Chronic Cardio days – tubs of ice cream.
Your body needs to replenish the glycogen, and it needs carbohydrates to do it. Gluconeogenesis can only get you so far if you’re pushing your body to its limits. In the face of heavy, glycogen-depleting training, a lack of Primal starch sources will have you craving sweets and grains in no time.
Lack of sleep has long been associated with overeating and obesity. For one thing, poor or disrupted sleep schedules promote disrupted cortisol secretion, which – as I’ve shown above – can affect our food choices. Bad sleep also increases insulin resistance, which changes how we process macronutrients (especially carbohydrates) and renders us more prone to fat gain. And now, a recent study has shown that a single bout of acute sleep deprivation (just one night) causes people to find food more rewarding. Sleep-deprived subjects derived more pleasure from food, desired more food, and reported more hunger than subjects who had slept. And that was just a single night. Just imagine the effects of days, weeks, or even years of chronic poor sleep.
If you’re running on no sleep, you may very well be more susceptible to the wiles of junk food.
Peer pressure doesn’t just occur in groups of teens smoking joints behind a 7-Eleven. It can happen at birthday parties, at office events, or during the holidays. Wherever treats are being served and the vast majority of those in attendance partake, those who would otherwise refuse the offered treats often feel pressured to give in. You hem and haw and try to say “No, thanks,” but you think you see shared glances between judgmental partiers, you sense hurt feelings from amateur bakers, and you worry about looking like a “health nut” (as if that were a terrible thing), so you take the slice of cake or square of brownie and partake. You know what happened last time you gave in. You remember quite vividly the downward spiral of junk indulgence that transpired then, and probably will again. But still you eat it.
One explanation may be that social rejection – even if it’s only imagined – can manifest as physical pain. To figure this out, researchers ran brain scans on study participants as they played a virtual ball-tossing game and then began excluding them from play (PDF). Ultimately, all participants were excluded from the game. During both explicit social exclusion (in which players were prevented from participating by other players) and implicit social exclusion (in which extenuating circumstances prevented participants from joining the game), the brain scans registered significant activity in the anterior cingulate cortex (ACC), a region of the brain that acts as a “neural alarm system” or a “conflict monitor.” Whenever “something is wrong,” the ACC lights up. Physical pain famously triggers the ACC, but the ACC doesn’t handle the raw physical sensation of pain; it registers the mental distress that accompanies it.
Distress is a negative sensation. It is unpleasant by its very definition. If you’ve resisted the treats in the past and felt socially isolated or rejected because of it, you may be conditioned to take the treat next time in order to avoid the isolation and avoid the activation of your neural distress center.
Do any of these sound familiar? When it comes to making poor dietary decisions, keep in mind that we are complex animals and the causes of our actions are multifactorial. Some or all of these factors may play into your particular misstep. Maybe you gorged on cake at the party both because your ACC was buzzing in trepidation at the prospect of social isolation and because you’d been putting in way too many road miles, you were overtrained, your cortisol was spiked, your blood sugar was low, and you were craving sugar. It could be any number of things from this list (and even some that aren’t on it).
So, while the decision ultimately rests on your plate, you might find it helpful to understand that a whole host of factors is actively influencing you. These aren’t excuses, and they don’t remove responsibility, but they do show you what might be going on under the hood. Hopefully by understanding exactly why we often make bad decisions about food against our better judgment, we can tip the scales in our favor before the next one is made.