The idea that vegetables are an essential part of a healthy diet has been hammered into our collective consciousness by every authority out there. Parents, teachers, scientists, and government health “experts” all stress the importance of eating your veggies. Problem is, they also told us that butter would kill us, margarine would save us, animal protein would give us cancer, and animal fat would give us heart disease. They said we should jog for an hour a day, three days a week, that deadlifts would hurt our backs, and that we need to wear shoes with “good arch support.” Basically, conventional wisdom gets it wrong an awful lot of the time, so what should we make of the CW regarding vegetables? It’s a fairly common query I receive from readers:
Do you really need to eat vegetables – or plant matter in general – to be healthy?