Let’s continue the discussion from last time. Again, I apologize for any meandering. This is a big topic, and I think it helps to leave no stone unturned.
Seasonal eating is currently pretty popular, perhaps even trendy in some circles. You’ve got the locavores, folks who only eat meat and produce grown and harvested within a certain radius (generally 50 or 100 miles). They don’t necessarily set out to eat by the seasons, but that’s how it works out when you only eat local food. Others are committed seasonalists (yeah, I may have made that term up), specifically choosing foods that are only available at that time of year. There are even a small number of strict ancestral seasonalists, who eat only those foods that were seasonally available to their ancestors. A lot of Primal dieters fall into this category, and they generally do it for health.
The vast majority of seasonal eaters and locavores are motivated by environmental or social concerns. By eating seasonal, local food, they’re trying to reduce their carbon footprints or stimulate the local economy. I’m all for keeping things local, but I’m really interested in seasonal eating for health reasons. Does seasonal eating optimize health?
It’s a tricky question, and I’m not sure there’s a definitive answer. You’d have to establish the definitive seasonal diet, and I’m not even sure such a thing exists. There’d have to be a single global seasonal cycle, and that’s obviously not the case. Seasons vary roughly with latitude, or distance from the equator: regions close to the equator tend to be warm year round, with wet and dry seasons, while regions farther from the equator see greater temperature swings. As I mentioned last week, we evolved in a mostly temperate climate studded with intense periods of drought and moisture. The landscape was varied (grasslands, forests, shrubby desert), but the warm weather allowed a fairly steady supply of plant and animal life. Wild plants, edible tubers, small lean game, large fatty game, fruits, and nuts were all available.
Okay. Let’s get this started. I’m just going to let loose with some stream-of-consciousness style speculation. I’ll try and throw in some links where they’re applicable, but I ain’t making any promises. (Hey, I just reread “On the Road” and Coltrane is on, so I’m in that mood). This isn’t to be confused with medical advice or scholarly prose.
My initial thought was that fruit (and therefore fructose) availability historically meant winter was coming. For more northern climes, like prehistoric Europe, this was definitely true. Let’s look at berries, everyone’s favorite Primal source of fructose. When are wild berries available? European wild berries flourish in the sun and are generally picked in late summer or early fall (according to this guide to the wild berries of Finland, where the summers are short and warm and the winters long and cold), right as the weather begins to turn cold. In European forests, there are several species of naturally occurring wild fruit trees. The genera Malus (apples), Prunus (plums and apricots), Pyrus (pears), and Sorbus (rowanberries) all grew and still grow in Europe, and their fruits all ripen in late summer and early fall. If Euro Grok was eating fruit, it’s pretty clear he ate it seasonally.
We all know what a high fructose intake can promote: insulin resistance, weight gain, metabolic syndrome. Sounds pretty bad, right? In northern climates, however, a little seasonal metabolic syndrome accompanied by a nice layer of adipose tissue might have been protective against the cold and the coming dearth of edibles. It wasn’t the chronic metabolic syndrome of industrialized nations. It was seasonal, and it probably made a lot of sense for cold weather humans to eat as much fruit as they could to prepare for the winter.
But wait – if you add tons of Omega-6 fats to lots of fructose, metabolic syndrome gets even worse (or “better,” depending on how you look at it, I guess). I wonder if polyunsaturated fat availability was seasonal, too. Since Grok wasn’t extracting oil from seeds using industrial processing, he had to get his PUFAs from whole foods, like nuts, seeds, and fowl. Nuts are certainly seasonal, and, at least in the US, they’re harvested mostly in fall. For cold weather Grok, this would place his greatest nut consumption in early fall, right in line with his elevated fructose intake. The combination of Omega-6 and fructose would represent a potent cocktail for pre-winter weight gain. (Before they hibernate, bears gorge on nuts, honey, berries, and fruit. Their metabolisms slow and they enter what might be described as a pretty intense bout of metabolic syndrome. I bet their triglycerides are sky high!)
What about today? Is there still an advantage to getting pudgy for the winter by overloading on fructose? I’m not sure, but I doubt it. We generally stay warm with clothing and heaters, and most people have access to plenty of food throughout the winter without needing to truck around a couple dozen pounds of fat energy on their person. I tend to think it was an adaptive behavior, a cultural (albeit unwitting) reaction to seasonal changes. It conferred real benefits on humans living in cold climates without steady food or reliable shelter, but I don’t think the same logic necessarily applies to humans (even descendants of Euro Grok) living today with plenty of food, shelter, and warm clothing. Remember, as far as we know, Homo sapiens have only lived in cold climates with distinct seasons (like northern Europe) for about 40,000 years, while the bulk of our genome was established over the 200,000 years spent in central and east Africa, in temperate climates with wet and dry seasons. If we’re genetically adapted to any seasonality, it’s going to be that one. We can’t fall into the trap of looking only to the prototypical hairy Grok stalking mammoths across frozen tundra. We can’t forget about the tropical, warm-weather Grok, with whom we all arguably share far more commonalities, regardless of ethnic background.
That brings up another point: cold weather humans were eating fructose and polyunsaturated fats in the relative absence of sunlight. That means little to no Vitamin D (beyond whatever they could wrest from dietary sources). What do we know about Vitamin D and fructose? Well, compared to glucose, increased intake of dietary fructose inhibits calcium absorption and induces Vitamin D “insufficiency.” Eat a ton of fructose and you need more Vitamin D to make up for it… unless the goal is to get insulin resistant, put on some weight, and stock up your energy stores for the coming winter.
Maybe seasonal (“protective”) metabolic syndrome is the result of eating fructose (along with PUFAs) without Vitamin D to quell the effects. We already know that European hunter-gatherers were under pressure to wring every last drop of Vitamin D from their environment, which is probably why they have white skin. Vitamin D wasn’t readily available, and for at least half the year it was unobtainable for lack of sun. If you look at our earliest tropical forebears, however, they had year round access to sun. They also had greater access to fructose.
That’s how tropical Grok enjoyed his fruit – with the sun blazing overhead. In fact, any traditional hunter-gatherer group that consumed fruit or fructose year round did so in a warm, “seasonless” climate. Take the Efe, from the Democratic Republic of Congo’s Ituri Rainforest (average temperature: 88 degrees F), who can derive up to 42% of their caloric intake from raw, wild honey. The Efe also happen to exhibit the L1 haplotype, long considered the oldest genetic haplotype, and the 90,000-year-old Semliki harpoon, one of the earliest known Homo sapiens tools, was discovered in traditional Efe hunting grounds.
What does this all mean for us modern humans? I think it means that strict (European) paleo reenactment (thanks to Kurt Harris for that term) – avoiding the sun for half the year and gorging on supersweet fruit in the fall – is unnecessary, or even harmful (unless we need metabolic syndrome’s “protection”). Are you holing up in some hut out on the tundra this winter? Are you a black bear with the ability to read? If so, then go ahead and avoid the sun and fill up on fructose and nuts, because you’ll probably need the body fat. The rest of us just need to be aware of the interplay between the seasons, fructose, and our metabolisms. Low sunlight and low Vitamin D coupled with high fructose intake tells the body that winter’s a’ coming. If we want to eat fruit, it probably makes sense to get plenty of Vitamin D, too.
Cold weather fructose consumption patterns weren’t ideal; they were just optimized to make the best of a tough situation. I’d argue that eating fructose the cold weather way (intermittently, with low Vitamin D levels) doesn’t make sense for most people today, and it may even be a big cause of modern obesity levels (instead of gorging on wild raspberries and walnuts while huddled in freezing caves, we guzzle soda and eat PUFA-laden French fries while sitting in air-conditioned homes).
What do you think? Is there something inherently beneficial to intermittent reenactment of northern European fructose consumption patterns, or do you agree that they are cultural adaptations to the realities of harsh winter conditions? Next week, I’ll continue the discussion.