"Your Health is Your Responsibility" ?
When my doctor (a G.P. at Kaiser) said this to me last winter, I thought it was just a response particular to our relationship. After all, I was defying him by not following his recommended diet, and I wasn't taking the statins he strongly recommended.
But in the last few weeks I've learned that this is common policy all over the country, both at Kaiser and with other doctors. When did this happen? Doctors certainly used to think and act as if my health were their responsibility. Now my G.P. asks me whether I want a blood test he suggests, and if I don't say anything, he doesn't order it. Is Obamacare influencing this? Is this now the policy everywhere?
About 20 years ago, dentists adopted this policy. So I took my teeth back from the world's dentists and have done much better. But it seems unfair to ask me to take over the care of my whole body: I never learned anything about human anatomy, biology, or biochemistry before reading Primal Blueprint.
"When the search for truth is confused with political advocacy, the pursuit of knowledge is reduced to the quest for power." - Alston Chase