Let me introduce myself. My name is Mark Sisson. I’m 63 years young. I live and work in Malibu, California. In a past life I was a professional marathoner and triathlete. Now my life goal is to help 100 million people get healthy. I started this blog in 2006 to empower people to take full responsibility for their own health and enjoyment of life by investigating, discussing, and critically rethinking everything we’ve assumed to be true about health and wellness.
Evolution and seasonality are inextricably intertwined. That isn’t a negotiable or controversial statement: evolution describes an organism’s response to environmental pressures, and the seasons are part of the environment. Another uncontroversial statement is that the study of human evolution can give us insight into what constitutes a healthy lifestyle for modern humans. I think it’s reasonable, then, to suggest that understanding how seasonality affected human evolution might give even more insight into best practices.
Most examinations of prehistoric climate change deal with average global temperatures, which can explain overall worldwide trends, but when we’re talking about human evolution – that is, the changes in the human organism that resulted from immediate, localized environmental pressures – knowing the global mean doesn’t tell us much. To understand how seasonality affected our development, we need to look beyond the global trends. We need to look at specific climate conditions.
The seasons change in many ways. There’s the obvious one – winter to spring to summer to fall – but how that seasonal transition plays out depends on the overall climatic conditions of the environment. That is, winter has meant different things in different regions and at different times in history. In some places, winter is cold and dry; in others, warm and wet. Seasonality depends heavily on climate.
Okay, so let’s take a look at our data. We’ve got a glacial period that began about 111,000 years ago and lasted roughly 100,000 years – half the time modern humans have been around (the most recent half). Throughout this glacial period, common geological features in the north included glaciers, huge sheets of continental ice, alternating warm/cold seasons, and arid conditions, all of which make vegetation seasonal and life fairly difficult.
Man grew up in the tropics. Yeah, there are subtropics and neotropics and whatever other distinctions you want to make, but the bulk of our evolution took place in tropical Eastern Africa, where and when it was warm. We also came of age during a glacial period that only just (11,000 years ago) ended. That glacial period was part of a larger ice age that began around 2 to 3 million years ago. We’re still in an ice age, technically, though popular parlance gets “ice age” and “glaciation” mixed up. An ice age is composed of glacial and interglacial periods. Today, we’re in the middle of an interglacial period.
That last glacial period (what we generally, if incorrectly, refer to as “the ice age”) began around 111,000 years ago and ended about 11,000 years ago. Modern humans (Homo sapiens sapiens) have been around for 200,000 years – that’s us. So, about half our time on Earth has been spent dealing with a glacial period. What’s involved in a glacial period, you might be wondering? Well, popular notions of glacial periods include barren tundra, steadily encroaching ice sheets, unstoppable glaciers (hence “glaciation”), hairy men (and women, I suppose) in animal pelts, seasonal vegetation, and wild game with massive stores of saturated back fat. For the northern latitudes, this is pretty accurate imagery. Canada and the northern United States were completely covered by ice. The Scandinavian ice sheet spanned the British Isles, Germany, Poland, Russia, and western Siberia. The Himalayas, Caucasus, and Alps experienced considerable glaciation. Glaciers reached Taiwan and the Japanese Alps, as well as the mountains of Morocco, Algeria, and Ethiopia. The hominids living in the affected areas, then, probably embodied the classic “caveman” lifestyle (the ones who survived, that is).
Sixty thousand years ago, when Europe was icy and forbidding, modern humans weren’t there. Neanderthals were, though, and they were undoubtedly made for the region. Bulky, robust, heavy set, muscular, with pronounced brow ridges – these guys were your archetypal cavemen. But they were not modern humans. When we finally did head northward out of Africa into Europe, around 40,000 years ago, we actually displaced the extant Neanderthals. We mingled and interacted with them along the way, and we may have even interbred with them, but we are not Neanderthals. Those early Europeans were still Africans, genetically, as the famous Hofmeyr Skull showed.
So, what does seasonal, evolutionary eating actually mean? To whom do we look for ultimate guidance?
In the Primal community, there’s a tendency to home in on the European hunter-gatherer experience for guidance in all things dietary. The big-game hunting, cave-painting Cro-Magnon is the first thing that comes to mind for most of us. That’s fine, to a point, but not when it means excluding from consideration the other hunter-gatherer populations living in completely different climates. We have to take it all in. It’s all relevant. They’re all humans.
If it’s human, it’s relevant, and we have to pay attention.
East Africa, the predominant site of human evolution, experienced the seasons as wet and dry, rather than hot and cold. It was always warm. There wasn’t widespread glaciation, except in the mountains. There were no African ice sheets. Glacial periods affected African climate, sure, but not by creating a winter wonderland. Glacial periods manifested as droughts and in the development of arid deserts and grasslands. Vegetation and game were available. Now, drought and desert undoubtedly altered the scope of human evolution by heavily impacting the humans (our ancestors) living there; it’s just a mistake to assume glacial periods meant fur coats and holing up in caves for the winter for everyone worldwide.
Put another way, when eating seasonally, do we eat according to the seasonal patterns experienced by our East African ancestors or our European/North American/Australian/Asian ancestors? Do we look to the past as a road map, or do we merely eat what’s in season at local farmers’ markets?
I’m not sure, but I’ll venture the safest guess I can muster.
All of the above. Everything matters. One thing is for certain, though: we’ve all got that African Homo sapiens sapiens blood running through our veins. Each of us – irrespective of nationality, ethnicity, or recent ancestry – has several hundred thousand years of tropical evolution to account for. That’s when we developed our taste for animal flesh and our big beautiful brains. But we’re adaptable creatures, we humans. We can thrive on different diets with different macronutrient ratios.
As long as you stick with the basics and avoid the foods that weren’t available regardless of season – stuff like refined sugar, vegetable oils, grains, and legumes – everything else is just fiddling at the margins. Keep one eye on the tropics and the other on the Paleolithic climatic region of your choice. Could a descendant of Northern Europeans, a regular Norseman, thrive on a tropical diet of fish, coconuts, pork, and yams? I bet he could. Could a Native American grow old and strong on the modern Primal hybrid eating plan of Big Ass Salads, omelets, and crock pot recipes? Sure, why not.
Seasonality shouldn’t be limiting. The fact that our ancestors evolved with perennial warmth and were still able to thrive in regions with actual seasons means we can handle just about anything. It means we can eat according to any season as long as we remember the basic nutritional laws that bind us all together, rules that were initially written in the tropics and then expanded upon in myriad other climates, seasons, and regions.
Thank you for reading this series of posts exploring the role seasonality plays in the human diet and health. If you missed any, you can catch up with the earlier entries in the series.