April 2008



THS reader Roselle sent in this question:

Is vitamin/mineral supplementation truly beneficial before & during pregnancy for women with a healthy diet?

The first thing I’d like to emphasize is the importance of this question. Adequate maternal nutrition prior to conception and during pregnancy can protect the baby from diabetes, stroke, heart disease, kidney disease and memory loss later in life.

Intuitively, most mothers know that what they eat will have a significant impact on the developing fetus. And traditional cultures have been aware of this for millennia. Special preconception and pregnancy diets have always emphasized foods that are particularly rich in certain nutrients known to promote healthy growth and development. In some cases, these groups provided special nutrients for fathers preparing to conceive as well.

Traditional cultures with access to the sea used fish eggs. Those that consumed dairy products used high-quality milk from the spring and fall when grass was green and rapidly growing. African groups whose water was low in iodine used the ashes of certain plant foods to supply this important element. These foods were always added to a foundational diet rich in liver and other organ meats, bones and skin, fats, seafood and whatever local plant foods were available.

In the Winter of 2007, Chris Masterjohn published a fantastic article called “Vitamins for Fetal Development: Conception to Birth” in the journal Wise Traditions in Food, Farming and the Healing Arts. Masterjohn remarks:

“Although modern science still has much research to accomplish in order to fully elucidate the value of traditional wisdom, it has already confirmed the fact that many of the nutritional factors that we now recognize as the most important to embryonic and fetal development are the same ones emphasized in traditional pregnancy and preconception diets.” (p.26)

What are these nutrients that both modern science and traditional wisdom recognize as essential? Briefly, they include:

  • Vitamin E: originally named “Fertility Factor X” in 1922 because rats could not reproduce without it. Recent research indicates it is almost certainly required for human reproduction.
  • Vitamin A: vitamin A is necessary for the differentiation and patterning of all the cells, tissues, and organs within the developing body. It is especially important for the development of the communication systems between the sense organs and the brain. Vitamin A deficiency during pregnancy has been shown to produce spontaneous abortion in several different species of animals.
  • Vitamin D: Vitamin D plays a role in lung development, and protects the newborn from tetany, convulsions and heart failure. Vitamin D probably plays a much larger role in fetal development than currently understood due to its interaction with vitamin A.
  • Vitamin K: relatively little is known about vitamin K’s role in embryonic and fetal development compared to vitamins A & D. However, cases of birth defects in the babies of mothers taking Warfarin (which depletes the body of vitamin K) suggest that vitamin K plays an essential role in the development of proper facial proportions and the fundamental development of the nervous system.
  • DHA: DHA may be necessary for the formation of neurons and for the synthesis of the important brain lipid phosphatidylserine. It is also the precursor to an important compound that protects the neurons from oxidative stress. The fetus hoards DHA from the mother and incorporates it into its brain at ten times the rate at which it can synthesize it.
  • Biotin: biotin is a B vitamin that has also been called “vitamin H”. Researchers have recently discovered that marginal biotin deficiency during pregnancy is extremely common. Biotin deficiency has been shown to cause birth defects in rats. Whether this extends to humans is currently unknown, but there is little reason not to increase biotin intake during pregnancy as a precaution.
  • Folate: the importance of folate during pregnancy is widely known. It is necessary for the production of new DNA, and new DNA is needed for new cells. Adequate folate intake prevents spinal cord and brain defects and increases birth weight. It may also prevent spontaneous abortion, mental retardation and deformities of the mouth, face and heart.
  • Choline: a low intake of choline during pregnancy is associated with a four-fold increased risk of spinal cord and brain defects. Choline plays a direct role in the development of the brain; in particular, for the formation of neurons and synapses.
  • Glycine: the amino acid glycine is “conditionally essential” during pregnancy. This means that while we can normally make enough of it ourselves to meet our needs, during pregnancy women must obtain it from the diet. It is required for protein synthesis in the fetus, and is almost certainly a limiting factor for fetal growth.

Based on the established role of the nutrients listed above, Masterjohn makes the following recommendations:

Nutritional recommendations for preconception and pregnancy

  • Take a daily dose of high-vitamin cod liver oil (available online from Radiant Life and Green Pasture) to obtain 20,000 IU of vitamin A, 2,000 IU of vitamin D, and 2 grams of omega-3 fatty acids (roughly 1 3/4 teaspoons per day).
  • Grass-fed animal fats supply vitamins E and K2; palm oil, fresh fruits and vegetables, nuts and freshly ground grains are also sources of vitamin E; fermented foods (cheese, yogurt, kefir, sauerkraut, etc.) are also good sources of vitamin K.
  • Biotin can be obtained from liver and egg yolks. Cooked egg whites can be eaten in moderation, and raw egg yolks (from organic, pastured chickens, of course) can be added to smoothies and cream to boost biotin status.
  • Folate can be obtained from liver, legumes, beets and greens. Choline can be obtained from grass-fed dairy, egg yolks, liver, meat, cruciferous vegetables, nuts and legumes.
  • Muscle meats and eggs should be used along with skin, bones and gelatin-rich broths to obtain glycine.

The answer to Roselle’s original question largely depends upon what is meant by “a healthy diet”. The low-fat, nutrient-depleted diet that is currently considered to be “healthy” by the medical establishment is likely to be deficient in several key nutrients, particularly the fat-soluble vitamins A, D & K and the omega-3 fatty acid DHA. However, even a nutrient-dense, whole foods diet may need to be supplemented with additional foods or additional servings of foods already in the diet.

Most of these can and should be obtained from local and organic foods. The exception is cod liver oil, which is one of nature’s highest sources of vitamins A & D and a rich source of DHA as well. Not all cod liver oil is created alike, however. Most commercial brands contain synthetic vitamins A & D, which are known to be toxic at high doses. Unfortunately, this means you will have to order high-vitamin cod liver oil from a reputable company online. The brands I recommend are Green Pasture High-Vitamin Fermented Cod Liver Oil or High-Vitamin Cod Liver Oil, and Radiant Life Cod Liver Oil.

Finally, I highly recommend obtaining the Winter 2007 “Wise Traditions” journal and reading the full article by Chris Masterjohn. It will eventually be available on the Weston A. Price Foundation website, but it can take up to one year from the original publication time for an article to be posted to the website.


THS reader Chad sent in this question:

Antidepressants – effective or placebo?

The use of antidepressant medication has become so widespread and commonly accepted that it seems almost sacrilegious to question it. But alas, questioning is the name of the game here at The Healthy Skeptic!

And what do you know? Antidepressants aren’t all they’re cracked up to be. In fact, a recent meta-review of published studies on the efficacy of antidepressant drugs revealed that selective serotonin reuptake inhibitors (SSRIs), which are the most commonly prescribed drugs to treat depression, have no clinically meaningful advantage over placebo.

What that means is that in most of the trials reviewed, patients who took a sugar pill recovered from depression just as often as those who took the active drug. This study may come as some surprise to both physicians and the general public, whose faith in the efficacy of these drugs has led to over 118 million prescriptions in 2007 and over $16 billion in sales.

But should this really come as a surprise? Antidepressant drugs are thought to act by altering levels of brain neurotransmitters; however, it takes several weeks before these changes can be measured. Yet patients often report symptomatic relief within hours or days of receiving an antidepressant.

Available data suggests, in fact, that SSRIs are no more effective than placebos and have considerable adverse effects and risks, including increased suicidality amongst both children and adults. Sapirstein and Kirsch conducted a meta-analysis of 3,000 patients who received either antidepressants, psychotherapy, placebo or no treatment at all. They found that 27% of therapeutic responses were attributable to drug activities, 50% to psychological factors, and 23% to “non-specific” factors. In other words, 73% of the response to the drug was unrelated to its pharmacological activities – and antidepressants may be no better or more specific than placebos.

This of course raises grave questions about why the National Institute for Health and Clinical Excellence (NICE) still recommends that antidepressants should be the first-line treatment for moderate or severe depression. Their message is identical to that of the Defeat Depression Campaign in the early 90s, which contributed to the 253% rise in antidepressant prescribing in 10 years.

In a review published in the British Medical Journal in February of 2006, researchers Joanna Moncrieff and Irving Kirsch point out that the NICE recommendations ignore even their own study data. Although the NICE meta-analysis of placebo controlled trials of SSRIs found statistically significant differences in levels of symptoms, these were so small that the effects were deemed “unlikely to be clinically important.”

After analyzing several published studies and reviews, Moncrieff and Kirsch reached the following conclusions:

Summary points

  1. SSRIs have no clinically meaningful advantage over placebo
  2. Claims that antidepressants are more effective in more severe conditions have little evidence to support them
  3. Methodological artifacts may account for the small degree of superiority shown over placebo
  4. Antidepressants have not been convincingly shown to affect the long-term outcome of depression or suicide rates

The response to a drug or placebo in a clinical trial for depression is often measured using the Hamilton rating scale, a multiple-choice questionnaire which doctors use to rate the severity of a patient’s condition. The questionnaire rates the severity of symptoms observed in depression such as low mood, insomnia, agitation, anxiety and weight loss; it is considered to be a highly reliable physician-rated scale and has been reported to be more sensitive than patient-rated scales to drug/placebo differences. (Murray, 1989)

In the NICE meta-analysis, the difference between drug and placebo groups was one point. The most commonly used 17 item version of the Hamilton scale has a maximum score of 52. It is highly unlikely that a difference of one point on a 52-point scale is clinically significant, a fact that the FDA has admitted in memoranda (Laughren, 1998; Leber, 1998) reviewed by Moncrieff and Kirsch.

Other studies have yielded similar results. A study by Khan et al. found a 10% difference in levels of symptoms between placebo and active drugs in two different meta-analyses. In a more recent review, Kirsch et al. invoked the Freedom of Information Act to obtain access to previously unpublished studies (drug companies are under no requirement to publish a study they have sponsored if the results don’t suit them). The overall difference between drugs and placebos in that analysis was 1.7 points on the Hamilton scale.

Moncrieff and Kirsch also point out that the Hamilton scale contains seven items concerning sleep and anxiety, with each item on sleep scoring up to six points. Therefore any drug with some sedative properties, including many antidepressants, could produce a difference of two points or more without exerting any specific antidepressant effect.
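To see just how small these drug-placebo gaps are, here is a quick back-of-the-envelope calculation (a sketch using only the figures quoted above) expressing each gap as a fraction of the 52-point maximum:

```python
HAMILTON_MAX = 52  # maximum score on the 17-item Hamilton scale

# Drug-placebo differences (in Hamilton points) quoted above
differences = {
    "NICE meta-analysis": 1.0,
    "Kirsch et al. (unpublished FDA data)": 1.7,
    "sleep items alone (possible sedative effect)": 2.0,
}

for source, points in differences.items():
    pct = 100 * points / HAMILTON_MAX
    print(f"{source}: {points} points = {pct:.1f}% of the scale")
```

Even the largest figure, the two points obtainable from sedation-sensitive sleep items alone, amounts to less than 4% of the scale’s range.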

Follow-up studies that track patients for a significant length of time have also shown very poor outcomes for people treated for depression both in the hospital and in outpatient settings, and the overall prevalence of depression is rising despite increased use of antidepressants. Suicide rates have increased in some groups and some countries despite increased prescribing of antidepressants, and there are continuing concerns that SSRIs may increase the risk of suicidal behavior in both children and adults.

In children, the balance of benefits to risks in antidepressant treatment is already recognized as “unfavorable”. The analyses performed by Moncrieff and Kirsch strongly suggest that the same is true for adults, and that the ongoing uncertainty about the possible risk of increased suicidality, as well as the adverse effects of antidepressant drugs, warrants a “thorough re-evaluation of our current approach” to treating depression.

I couldn’t agree more. One question the authors failed to pose, which I believe to be at the root of the matter, is why are so many more children and adults depressed now than before? You might not be surprised to learn that I have some thoughts about this. But I’ll save them for another post.

The popular perception that the U.S. has the highest quality of medical care in the world has been shown to be entirely false by several public health studies and reports over the past few years.

In 2000, the prestigious Journal of the American Medical Association published a study by Dr. Barbara Starfield, a medical doctor with a Master’s degree in Public Health, which revealed the extremely poor performance of the United States health care system when compared to other industrialized countries (Japan, Sweden, Canada, France, Australia, Spain, Finland, the Netherlands, the United Kingdom, Denmark, Belgium and Germany).

In fact, the U.S. is ranked last or near last in several significant health care indicators:

  • 13th (last) for low-birth-weight percentages
  • 13th for neonatal mortality and infant mortality overall
  • 11th for postneonatal mortality
  • 13th for years of potential life lost (excluding external causes)
  • 12th for life expectancy at 1 year for males, 11th for females
  • 12th for life expectancy at 15 years for males, 10th for females

The most shocking revelation of her report is that iatrogenic damage (defined as a state of ill health or adverse effect resulting from medical treatment) is the third leading cause of death in the U.S., after heart disease and cancer.

Let me pause while you take that in.

This means that doctors and hospitals are responsible for more deaths each year than cerebrovascular disease, chronic respiratory diseases, accidents, diabetes, Alzheimer’s disease and pneumonia.

The combined effect of errors and adverse effects that occur because of iatrogenic damage includes:

  • 12,000 deaths/year from unnecessary surgery
  • 7,000 deaths/year from medication errors in hospitals
  • 20,000 deaths/year from other errors in hospitals
  • 80,000 deaths/year from nosocomial infections in hospitals
  • 106,000 deaths a year from nonerror, adverse effects of medications

This amounts to a total of 225,000 deaths per year from iatrogenic causes. However, Starfield notes three important caveats in her study:

  • Most of the data are derived from studies in hospitalized patients
  • The estimates are for deaths only and do not include adverse effects associated with disability or discomfort
  • The estimates of death due to error are lower than those in the Institute of Medicine report (an earlier report on the number of iatrogenic deaths in the U.S.)

If these caveats are considered, the deaths due to iatrogenic causes would range from 230,000 to 284,000.
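As a quick sanity check, the category figures listed above can be summed directly; the sketch below simply reproduces Starfield’s arithmetic:

```python
# Annual U.S. deaths from iatrogenic causes, as reported by Starfield (2000)
iatrogenic_deaths = {
    "unnecessary surgery": 12_000,
    "medication errors in hospitals": 7_000,
    "other errors in hospitals": 20_000,
    "nosocomial infections in hospitals": 80_000,
    "non-error adverse effects of medications": 106_000,
}

total = sum(iatrogenic_deaths.values())
print(total)  # 225000
```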

Starfield and her colleagues performed an analysis which took the caveats above into consideration and included adverse effects other than death. Their analysis concluded that between 4% and 18% of consecutive patients experience adverse effects in outpatient settings, with:

  • 116 million extra physician visits
  • 77 million extra prescriptions
  • 17 million emergency department visits
  • 8 million hospitalizations
  • 3 million long-term admissions
  • 199,000 additional deaths
  • $77 billion in extra costs (equivalent to the aggregate cost of care of patients with diabetes)

I want to make it clear that I am not condemning physicians in general. In fact, most of the doctors I’ve come into contact with in the course of my life have been competent and genuinely concerned about my welfare. In many ways physicians are just as victimized by the deficiencies of our health-care system as patients and consumers are. With increased patient loads and mandated time limits for patient visits set by HMOs, most doctors are doing the best they can to survive our broken and corrupt health-care system.

The Institute of Medicine’s report (“To Err is Human”) which Starfield and her colleagues analyzed isn’t the only study to expose the failures of the U.S. health-care system. The World Health Organization issued a report in 2000, using different indicators than the IOM report, that ranked the U.S. 15th among 25 industrialized countries.

As Starfield points out, the “real explanation for relatively poor health in the United States is undoubtedly complex and multifactorial.” Two significant causes of our poor standing are over-reliance on technology and a poorly developed primary care infrastructure. The United States is second only to Japan in the availability of technological procedures such as MRIs and CAT scans. However, this has not translated into a higher standard of care, and in fact may be linked to the “cascade effect,” whereby diagnostic procedures lead to more treatment (which, as we have seen, can lead to more deaths).

Of the 7 countries at the top of the average health ranking, 5 have strong primary care infrastructures. Evidence indicates that the major benefit of health-care access accrues only when it facilitates receipt of primary care. (Starfield, 1998)

One might think that these sobering analyses of the U.S. health-care system would have led to a public discussion and debate over how to address the shortcomings. Alas, medical authorities and the general public alike are mostly unaware of this data, and we are no closer to a safe, accessible and effective health-care system today than we were eight years ago when these reports were published.

Recommended links

  • Is US Health Really the Best in the World?

The Healthy Skeptic reader Jessica wrote in with this topic suggestion:

“I like the “what to feed children” idea. But it has to be food they will actually EAT.”

The question of how to nourish our children so they develop into healthy adults is one of the most important questions we can ask. Tragically, the answers that the medical mainstream has come up with have contributed to unprecedented epidemics of childhood disease and endangered the health and well-being of our children.

The numbers of overweight and obese children worldwide are expected to climb dramatically by 2010, according to a study by Youfa Wang, PhD, MD at the Johns Hopkins Bloomberg School of Public Health. By the end of the decade, 46 percent of children in North and South America are projected to be overweight and 15 percent will be obese. It’s been assumed that U.S. life expectancy would rise indefinitely, but a new data analysis which was published as a special report in the March 17, 2005 issue of New England Journal of Medicine suggests that this trend is about to reverse itself – due to the rapid rise in obesity, especially among children.

Increasing numbers of children are being treated for depression, according to a 2004 study in the British Medical Journal. A 1999 report in California from the state’s Department of Developmental Services found that autism had increased by 273 percent from 1987 to 1998. Current estimates for the incidence of autism are as high as 1 in 120. A national review by The Advocacy Institute in 2002 revealed that learning disabilities in children increased by 30 percent from 1990 to 2000.

These studies show that our children are more obese, more depressed, and have more learning disabilities and behavioral problems than ever before. What could be the cause of such a dramatic change?

Although each of these diseases is complex and multifactorial, it is safe to say that diet and nutrition play a significant role in all of them. For example, consider the key nutrients for brain development in children:

Key nutrients for brain development

  • Vitamin A
  • Vitamin D
  • Choline
  • DHA
  • Zinc
  • Tryptophan
  • Cholesterol

Many parents probably know that these nutrients aren’t found in the refined carbohydrates, vegetable oils and sugars which form the bedrock of the standard American diet. Yet many parents may be unaware that even foods widely assumed to be nutritious – including packaged foods commonly described as “organic”, “natural” or “fortified” – are themselves highly processed and stripped of nutritional value, and little better than their “non-organic” alternatives.

So what should we be feeding our children to ensure healthy growth and development? The following “First Steps” recommended by children’s health advocacy group Nourishing Our Children will get you started:

First steps to healthier children

  1. Replace sugar with natural sweeteners like honey and rapadura.
  2. Replace fruit juices with whole, raw milk.
  3. Replace breakfast cereals with non-nitrate bacon, eggs from hens on pasture, whole milk yogurt, homemade kefir, soaked oatmeal or soaked, wholegrain pancakes.
  4. Replace pasteurized dairy products with raw and cultured dairy.
  5. Eliminate all processed soy foods from your household (this includes soy milk, “protein bars” with soy, baked tofu products and all “soy fast food”).
  6. Replace polyunsaturated vegetable oils and trans fats with traditional fats such as butter, olive oil, coconut oil, palm oil, lard, and tallow.
  7. Replace processed, convenience foods (boxed, packaged, prepared and canned food items) with fresh, organic, whole foods
  8. Provide a daily dose of high vitamin cod liver oil (with no synthetic vitamins added)

In contrast to the bland, unsatisfying (and dangerous) low-fat diet recommended by medical authorities, kids naturally love the foods in a nutrient-dense, whole foods diet. However, it is true that if they’ve been on a diet high in sugar and refined carbohydrates for a long time, there will be an adjustment period as they transition away from those highly processed foods.

My suggestion is to take one item on the list above at a time, and be gentle with yourself. It may take a while longer that way to get to where you want to be, but it’s worth the effort! Some of the changes will be more difficult than others. For example, most children (and adults) prefer the taste of saturated fats like butter, cream and whole-fat dairy to low-fat alternatives such as vegetable oil and skim milk – but may not yet have acquired a taste for cod liver oil!

I’ve provided links to some articles below with some helpful ideas on how to encourage even the most finicky eaters to enjoy nutrient-dense foods and some ideas for quick and healthy brown-bag lunch suggestions for parents.

Recommended links

  • Articles on children’s health – Weston A. Price Foundation
  • Feeding Our Children, by Thomas Cowan, M.D.
  • Taking the Icky out of Picky Eaters
  • Foods to Tantalize Toddlers and Preschoolers
  • Packing the Perfect Lunch Box
  • Nourishing Our Children – children’s health advocacy group

Here’s a question I received today from Julie:

I recently learned that green tea and black tea are very high in fluoride because they pull it from the ground… can you say more about this?

As a tea drinker myself, I wish I could tell you that this is a myth. Unfortunately, it’s all too true. And although medical authorities continue to tell us that fluoride is a harmless substance which prevents dental caries and tooth decay, a large body of scientific research says otherwise.

Fluorides are toxins that accumulate in the body over time. That’s why the Surgeon General has established limits for maximum fluoride content in our drinking water, which are regulated by the EPA. This limit was set in order to avoid a condition known as Crippling Skeletal Fluorosis (CSF). The limit of four parts per million (ppm), or 4 mg/liter, was designed to prevent only the third and most serious stage of CSF, in which the extremities become weak and the vertebrae partially fuse together, crippling the patient. Yet studies published by the World Health Organization (1970) have shown that a daily dose of even 2-8 mg of fluoride can cause third-stage CSF!

For a more thorough discussion of the dangers of fluoride, I recommend reading this article by Andreas Schuld, the head of an organization called Parents of Fluoride Poisoned Children (PFPC), and visiting the PFPC website. I have also embedded a video interview with Christopher Bryson at the end of this post. Bryson is the author of The Fluoride Deception, a scathing critique of one of the most damaging public health misconceptions of our time.

Schuld mentions several different sources of fluoride, from foods to prescription drugs. The highest source of fluoride in any edible plant, however, is tea leaves. Fluoride content in teas has risen precipitously over the past 20 years due to industry contamination and other environmental factors. In a 2005 study at the Washington University School of Medicine in St. Louis, researchers found that some regular strength preparations contain as much as 6.5 parts per million (ppm) of fluoride, well over the 4 ppm maximum allowed in drinking water by the Environmental Protection Agency and 2.4 ppm permitted in bottled water and beverages by the Food and Drug Administration. The Public Health Service indicates that the fluoride concentration in drinking water should not exceed 1.2 ppm.

More recent studies cited by Schuld in his article have revealed a fluoride content of 17.25 mg per teabag or cup in black tea, and a whopping 22 mg of soluble fluoride ions per teabag or cup in green tea. The longer a tea bag is steeped, the higher the fluoride content will be. In fact, one study demonstrated that the amount of measurable fluoride almost doubles in just ten minutes.

To put this in perspective, drinking one cup of green tea with 22 mg of soluble fluoride ions is equivalent to drinking 22 liters of water that has been fluoridated to the Public Health Service recommended level of 1 ppm.
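The equivalence is straightforward unit arithmetic: at 1 ppm, fluoridated water carries 1 mg of fluoride per liter, so dividing the tea’s fluoride content by that concentration gives the volume of water carrying the same dose. A minimal sketch:

```python
tea_fluoride_mg = 22.0    # soluble fluoride per cup of green tea (study cited above)
water_mg_per_liter = 1.0  # 1 ppm, the Public Health Service recommended level

# Liters of fluoridated water with the same fluoride dose as one cup of tea
equivalent_liters = tea_fluoride_mg / water_mg_per_liter
print(equivalent_liters)  # 22.0
```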

Tea is the second most widely consumed beverage in the world behind water. And believe it or not, nearly 127 million Americans (almost half) drink tea. Tea is consumed in far greater amounts in countries like the UK, China and India. The increasing fluoride levels in our environment pose a significant threat to the health and well-being of literally billions of people around the world. We must urge public health officials to stop fluoridating our water and begin to acknowledge the overwhelming amount of scientific data indicating fluoride’s toxicity and negative impact on human health.

Recommended links

  • Fluoride: Worse Than we Thought
  • Fluoridation: The Fraud of the Century
  • Fluoride levels in tea – USA
  • Fluoride levels in food
  • The Fluoride Education Project

The video below is an interview with Christopher Bryson, author of The Fluoride Deception. It is an excellent introduction to the history and dangers of fluoridation.

I’d like to introduce a new feature here at The Healthy Skeptic. Once a month (or so) I’ll invite you to submit any questions you have or topics you’d like to see addressed on the blog. I’ll keep a list of your requests and add them into the mix of content I have planned for the coming month.

Wondering whether saturated fat and cholesterol actually cause heart disease? Not sure what to feed your kids to support their growth and development? Want to know the truth about fluoride? Confused about the difference between carotenes and Vitamin A? Looking for a good way to cook liver? It’s all fair game!

Please leave a comment below this post with your questions or topic suggestions. And make sure to check the box to “subscribe to comments” if you want to see what others are asking too. I look forward to hearing from you!

There’s no doubt that optimal nutrition plays a significant role in supporting our health and well-being. But nutrition, as important as it is, obviously isn’t the only factor that influences our physiology.

Over the past several years, an increasing amount of research has focused on the role of emotions, behavior and beliefs in contributing to both health and disease. In fact, an entirely new discipline called “psychoneuroimmunology” (say that three times fast!) has emerged to study the connection between the mind and the body. In short, what has been revealed is that the separation we make between “the mind” and “the body” is largely an illusion. Mind and body exist in a continuous and interrelated web of connections that is only now beginning to be discovered by western science.

But though the idea that our thoughts and emotions can directly influence our physiology is new to modern biomedicine (just ten years ago it was dismissed by most physicians and researchers as so much “New Age” fluff), it has been deeply ingrained in our cultural paradigm for centuries. It is embedded in our language; consider the phrases “worried sick” or “scared to death”, and you’ll know what I mean. I’m sure all of you have had the experience of becoming ill after a particularly stressful period at work, or feeling moody and perhaps depressed while you are physically ill. These are both prime examples of how interconnected our mental and physical health are.

In their book Feeling Good Is Good For You, researchers Carl J. Charnetski and Francis X. Brennan set out to review the emerging evidence that pleasure can boost our immune systems and lengthen our lives. According to the authors:

“In every way, stress is the antithesis of pleasure. It jangles your nerves, juggles a whole host of your body’s hormones, elevates your blood pressure, and makes your pulse race… It also weakens your immune system’s ability to resist illness and disease.”

If stress is the antithesis of pleasure, then it follows that pleasure is the antithesis of stress. And the best way to fight stress, say Charnetski and Brennan, is with pleasure. Our bodies secrete chemicals called endorphins when we experience pleasure. Animal research has revealed, for example, that endorphin levels are up to 86 times higher after animals experience multiple orgasms! But endorphins are also released, albeit at lower levels, in more mundane daily activities such as playing with a pet, watching a funny movie, listening to our favorite music, visiting a favorite place or connecting with loved ones.

The chemicals released when we experience pleasure do more than counteract stress hormones and improve mood. Consider these additional effects:

  • They improve immune function by producing an antibacterial peptide
  • They enhance the killer instincts and abilities of various immune components, including B cells, T cells, NK cells, and immunoglobulins
  • They enable certain immune cells to secrete their own endorphins as a way of improving their disease-fighting capacity

Charnetski and Brennan examine several “pleasure inducing” experiences that have been scientifically proven to promote health and well-being:

  • Music
  • Touch
  • Pets
  • Humor
  • Positive attitude and insight

Most of us are already aware of the healing power of those things listed above – at least on some level. But in this culture, there is also an overwhelming reliance on medicine, surgery, diet and other physiological interventions to treat disease. Though we may pay lip service to the idea that stress causes illness and pleasure can prevent it, how many of us actually attribute the same importance to listening to music or watching a funny movie as we do to taking a pill? The lesson in this book is that our thoughts, beliefs, emotions and behavior are all capable of inducing the same physiological changes in our bodies as foods, supplements, pills and even surgery are.

If you doubt that this is true, consider the placebo effect. It has been proven over and over again that pharmacologically inert substances like sugar pills can have therapeutic effects identical to, or in certain cases even greater than, those of drugs. Even more impressive are the trials showing that sham surgery (when small incisions are made to convince the patient they have had the operation, but no surgery is performed) is at times as effective as the actual surgery.

Clearly this points to the power we all have to heal ourselves. If only the suggestion or belief that we will heal is enough to induce the physiological changes that lead to healing, without the presence of any “active” pharmacological substance or surgical intervention, then clearly our thoughts, beliefs and emotions have the potential to be powerful medicine.

New research has just been published in the Journal of the American Society of Nephrology that questions the long-held popular belief that drinking eight glasses of water a day benefits our health.

According to Dr. Stanley Goldfarb and Dr. Dan Negoianu of the University of Pennsylvania in Philadelphia, there are four prevalent myths about increased water intake, namely that it:

  1. Leads to more toxin excretion
  2. Improves skin tone
  3. Makes one less hungry
  4. Reduces headache frequency

Dr. Goldfarb and Dr. Negoianu reviewed all of the published studies which examined the health benefits of water consumption. They concluded that people in hot, dry climates, athletes or people with certain diseases might do better with increased fluid intake, but for average healthy people, more water did not mean better health.

“There is no clear evidence of benefit from drinking increased amounts of water,” Dr. Goldfarb wrote, but he also added, “There is also no clear evidence of lack of benefit.” In other words, the scientific research simply doesn’t tell us one way or the other.

Fortunately, nature has endowed us with a mechanism that can in fact help us determine how much water we need to be drinking per day. It’s called thirst. If we simply pay attention to our thirst and respond appropriately, it’s likely that we will take in as much water as we need. Four to six glasses per day is probably sufficient for most people; but then again, the evidence indicates there is no harm in drinking more, so if you enjoy drinking a lot of water then knock yourself out!

There is no evidence that increased water consumption helps to excrete toxins. The kidneys perform that function in the body, and as long as they are healthy they do it very well. As Dr. Goldfarb puts it: “The kidneys clear toxins. This is what the kidneys do. They do it very effectively. And they do it independently of how much water you take in. When you take in a lot of water, all you do is put out more urine but not more toxins in the urine.”

There is no evidence supporting the other three myths either; namely, that water improves skin tone, reduces hunger or alleviates headaches. But again, if your experience is different and you find that water does help with these conditions, then there is absolutely no reason not to continue what you’re doing now (other than perhaps more frequent trips to the bathroom!). Just don’t go crazy with the water intake, because extremely high levels of water consumption can upset the fluid balance in the body, causing “water intoxication” and even death.

Finally, I’d be remiss if I didn’t emphasize that the quality of the water we drink is much more important than the quantity. My recommendation is that you invest in a high-quality water filter and install it in your home. Avoid bottled water, which is often simply tap water packaged in a plastic bottle that can potentially leach toxins into the water – especially when left in the sun. (You know that “plasticky” smell when you drink water from a plastic bottle that has been around for a while? Not good. Not good at all.) Nalgene bottles should also be avoided as they can leach another unsafe chemical called BPA into your water. Instead, buy a stainless steel water bottle and fill it up with your filtered water at home before you go out.

Also, both tap water and filtered bottled water contain fluoride, a highly toxic bone poison that should be avoided at all costs. Many commercial water filters unfortunately do not remove fluoride, which is present in our water supply because of the gross misconception that it supports dental health. But more on that myth in another article.

Make sure to check out part I of “Why grass-fed is best” for the environmental and ethical benefits of pasture-raised animal products.

In part I we reviewed the environmental and ethical benefits of pasture-raised animal products, along with some general information about why they are more nutritious. In this article, we’ll look more specifically at exactly why grass-fed animal products are superior to commercially-raised alternatives.

Meat

  • Meat from grass-fed animals has two to four times more omega-3 fatty acids than meat from grain-fed animals.
  • When chickens are housed indoors and deprived of greens, their meat and eggs also become artificially low in omega-3s.
  • Eggs from pastured hens can contain as much as 19 times more omega-3s than eggs from factory hens.
  • When ruminants are raised on fresh pasture alone, their products contain from three to five times more CLA than products from animals fed conventional diets. CLA is a fatty acid that has recently been studied as a potent cancer fighter.
  • Meat from pastured cattle is four times higher in vitamin E than meat from feedlot cattle and, interestingly, almost twice as high as meat from feedlot cattle given vitamin E supplements.

Milk

  • Unfortunately, 85 to 95 percent of the cows in the United States are now being raised in confinement, not on pasture. The only grass they eat comes in the form of hay, and the ground that they stand on is a blend of dirt and manure.
  • Milk from a pastured cow can have five times as much CLA as milk from a grain-fed animal.
  • Milk from pastured cows also contains an ideal ratio of the two essential fatty acids (EFAs), omega-6 and omega-3. Studies suggest that if your diet contains roughly equal amounts of these two fats, you will have a lower risk of cancer, cardiovascular disease, autoimmune disorders, allergies, obesity, diabetes, dementia, and various other mental disorders.
  • When a cow is raised on pasture, her milk has an ideal ratio of omega-6 to omega-3 fatty acids. Replace two-thirds of the pasture with a grain-based diet and the milk will have more than five times the amount of omega-6 fatty acids than omega-3s, a ratio that has been linked with an increased risk of a wide variety of conditions, including obesity, diabetes, depression, and cancer.
  • Grassfed milk is higher in beta-carotene, vitamin A, and vitamin E. This vitamin bonus comes, in part, from the fact that fresh pasture has more of these nutrients than grain or hay. These extra helpings of vitamins are then transferred to the cow’s milk.

Free-range (pastured) eggs

When compared to commercially raised supermarket eggs, free-range eggs have:

  • 2/3 more vitamin A
  • 7 times more beta-carotene
  • Up to 19 times more omega-3 fatty acids
  • Significantly more folic acid and vitamin B12

Raw dairy products – another step up

The information above should convince you that grass-fed dairy products are superior in every way to dairy products that come from grain-fed cows. Another important distinction to be made is the difference between raw and pasteurized dairy products.

I will be covering this in further detail in a future article, but in short raw dairy products have several significant advantages over pasteurized alternatives:

  • Raw milk is an outstanding source of nutrients, including beneficial bacteria such as Lactobacillus acidophilus, vitamins and enzymes, as well as being the finest source of calcium available.
  • Pasteurizing milk destroys enzymes, diminishes vitamin content, denatures fragile milk proteins, destroys vitamins B6 and B12, kills beneficial bacteria and promotes pathogens.
  • Raw milk is not associated with any of the problems of pasteurized milk, and even people who have been allergic to pasteurized milk for many years can typically tolerate and even thrive on raw milk.

Contrary to popular belief, raw milk is safe to consume. There has never been a pathogen found in the milk of the two largest raw dairy producers in California, Organic Pastures and Claravale. In fact, the USDA has been unable to even find pathogens in the soil at Organic Pastures – which is highly unusual. This is due to the much more stringent standards for sanitation that raw dairies must comply with in order to be licensed to sell their products.

Again, I will cover this in more detail in a future article. Stay tuned!

Although most consumers have heard of grass-fed or pasture-raised animal products, confusion still abounds about what their benefits are and why we should choose them over commercially-raised animal products.

It is important to note that the “organic” label does not have anything to do with whether an animal product is pasture-raised or not. It’s possible, and indeed common, for an organic meat or dairy product to come from cows raised in confinement feedlots. Likewise, it is also common to encounter pasture-raised animal products that do not have the “organic” label. This often occurs when the farm raising the animals is too small to afford the expensive organic certification process. In these cases, if one knows the farmer and his or her practices, it is preferable to choose the non-organic, grass-fed source over the organic, commercially-raised alternative.

Many environmental and ethical objections to eating meat stem from the tremendously destructive and cruel practices of commercial feedlot meat production. When meat and dairy animals are raised in a humane and ecologically responsible manner, these objections (which I entirely agree with in the case of commercial production) are no longer defendable.

In this two-part article I will cover the benefits of pasture-raised animal products. In part I, we’ll examine the environmental and ethical benefits, and in part II, we’ll look at the nutritional and health benefits. Information is adapted in part from the Eat Wild website.

Back to the pasture
Pasture-raised animals live on the range where they forage on their native diet. They are not sent to feedlots to be fattened on corn, soy or other grains which they do not normally eat, and they are not treated with hormones or fed growth-promoting additives. As a result, the animals grow at a natural pace. For these reasons and more, grass-fed animals live low-stress lives and are so healthy there is no reason to treat them with antibiotics or other drugs.

More Nutritious
A major benefit of raising animals on pasture is that their products are healthier for you. For example, compared with feedlot meat, meat from grass-fed beef, bison, lamb and goats has two to four times more omega-3 fatty acids. Meat and dairy products from grass-fed ruminants are the richest known source of another type of good fat called “conjugated linoleic acid” or CLA. When ruminants are raised on fresh pasture alone, their products contain from three to five times more CLA than products from animals fed conventional diets. Grass-fed meat also has more vitamin E, beta-carotene and vitamin C than grain-fed meat.

Factory Farming
Raising animals on pasture is dramatically different from the status quo. Virtually all the meat, eggs, and dairy products that you find in the supermarket come from animals raised in confinement in large facilities called CAFOs or “Confined Animal Feeding Operations.”  These highly mechanized operations provide a year-round supply of food at a reasonable price. Although the food is cheap and convenient, there is growing recognition that factory farming creates a host of problems, including:

  • Animal stress and abuse
  • Air, land, and water pollution
  • The unnecessary use of hormones, antibiotics, and other drugs
  • Low-paid, stressful farm work
  • The loss of small family farms
  • Food with less nutritional value

Unnatural Diets
Animals raised in factory farms are given diets designed to boost their productivity and lower costs. The main ingredients are genetically modified grain and soy that are kept at artificially low prices by government subsidies. To further cut costs, the feed may also contain “by-product feedstuff” such as municipal garbage, stale pastry, chicken feathers, and candy. Until 1997, U.S. cattle were also being fed meat that had been trimmed from other cattle, in effect turning herbivores into carnivores. This unnatural practice is believed to be the underlying cause of BSE or “mad cow disease.”

Environmental Degradation
When animals are raised in feedlots or cages, they deposit large amounts of manure in a small amount of space. The manure must be collected and transported away from the area, an expensive proposition. To cut costs, it is dumped as close to the feedlot as possible. As a result, the surrounding soil is overloaded with nutrients, which can cause ground and water pollution. When animals are raised outdoors on pasture, their manure is spread over a wide area of land, making it a welcome source of organic fertilizer, not a “waste management problem.”

Make sure to see part II for the nutritional and health benefits of pasture-raised animal products.

Exposure to sunlight prevents melanoma.

Yes, you did read that correctly.

Two independent studies published in the Feb. 2005 issue of the prestigious Journal of the National Cancer Institute (JNCI) squarely contradict the popular myth that UV light causes melanoma.

The first study evaluated the hypothesis that UV radiation increases your risk of developing lymphoma – a hypothesis that had become widely accepted in the 1990s and early 2000s. After studying nearly 7,000 subjects, the authors concluded that the opposite is actually true: increased sun exposure reduces the risk of non-Hodgkin’s lymphoma (NHL) by up to 40%. What’s more, the reduction in risk was dose-related, which means that the more sun exposure someone got, the lower their risk of cancer was.

The second study looked at the link between sun exposure and the chances of surviving melanoma, which is the deadliest form of skin cancer. Guess what? The researchers concluded that increased sun exposure decreases the chance of dying from skin cancer by approximately 50%.

At this point you might be scratching your head and wondering how this could possibly be true, in light of what we’ve been told all these years about the relationship between sunlight and skin cancer. Let’s take a closer look at what explains this phenomenon, and why you likely haven’t heard about it on the news.

Clarification

An editorial published in the same issue of JNCI begins with this statement:

“Solar radiation is a well-established skin carcinogen, responsible for more cancers worldwide than any other single agent.”

This is true. But what the authors neglect to mention is that the type of cancer they are referring to is not melanoma but other types of cancer. Melanoma is the most serious form of skin cancer because it is malignant and can metastasize (spread) to other areas of the body, often leading to death.

But 90 percent of skin cancers are not melanomas. Rather, the most common forms are basal and squamous cell carcinomas, which are often benign and easily cured by simple outpatient surgery. These non-malignant forms of skin cancer are indeed caused by solar radiation (at least according to current research). Melanomas, however, are most likely caused by lack of sunlight or excess exposure to artificial light!

The editorial mentioned two other very important facts that you aren’t likely to hear about from mainstream media sources: that melanoma is normally found in areas of the body that are not typically exposed to sunlight at all (use your imagination), and that vitamin D may be important in preventing melanoma.

Here’s what they actually had to say:

“Evidence is beginning to emerge that sunlight exposure, particularly as it relates to vitamin D synthesized in the skin under the influence of solar radiation, might have a beneficial influence for certain cancers.”

Umm, like, we already knew that.

The role of Vitamin D

It has been known for several years that sun exposure might have a beneficial effect on certain cancers. A 1999 publication of the National Institute of Health (NIH) entitled Atlas of Cancer Mortality in the United States revealed that among caucasians in the United States, cancer mortality for several prominent cancers, including cancer of the breast, prostate and colon, shows a striking latitudinal gradient. Specifically, people living in northern states have much higher rates of these cancers than those residing in the southern states.

The reason for this? Northern states get a whole lot less sunshine than southern states.

As early as 1990 it was proposed that vitamin D, which is synthesized in the skin upon exposure to UV light, might be the agent that accounts for these geographical patterns. (Garland et al. 1990) Less exposure to sunshine means less production of vitamin D. It is known that calcitriol, the active form of vitamin D3, has multiple cellular effects that could confer protection against cancer. The ability to convert the vitamin D precursor to the active form (calcitriol) is greatly reduced at northern latitudes, and populations living far from the equator are at increased risk of vitamin D deficiency during the winter months. (Tangpricha et al. 2002)

Even more significant may be the observation that patients with malignant melanoma exhibit low levels of vitamin D3 in their blood, and that others have a problem with the receptor for vitamin D. (Hutchinson et al. 2000; Green et al. 1983) The incidence of melanoma of the skin on sites of the body intermittently exposed to sunlight is reduced among outdoor workers compared with indoor workers. (Elwood et al. 1985)

All of this points to a protective role for vitamin D against cancer in general, and melanoma in particular. But the final nail in the coffin of the “sunlight causes melanoma” hypothesis is this:

A comprehensive review of research studies from 1966 through 2003 failed to show any association between melanoma and sunscreen use! (Dennis et al. 2003)

Say what? Sunscreen doesn’t prevent skin cancer, that’s what.

Does sunscreen contribute to skin cancer?

One thing sunlight does cause is an injury to the inner layer of the skin (called the “dermis”), which leads to a wrinkling of the outer layer (called the “epidermis”). This phenomenon, which happens naturally with age but is accelerated by sun exposure, is called “solar elastosis”, or SE.

Sounds like a bad thing, right? But when researchers at the University of New Mexico studied melanoma, they found a marked decrease in the disease in patients with SE. (Berwick et al. 2005). To put it simply: more sun exposure equals lower risk of melanoma. For patients who already had melanoma, the subsequent death rate from the disease was approximately one-half as high in the group of patients with signs of SE.

I’ll give you a minute to finish cursing the “medical authorities” that have been admonishing us to slather ourselves and our children with sunscreen for decades in order to “prevent skin cancer”. As it turns out, by following this advice (and why wouldn’t we have? It sounded logical…) we have actually increased our own chances and our children’s chances of developing not just skin cancer, but other cancers as well.

I’m sorry to scare you like that, but I feel I must in order to make this point as clearly as I can:

Exposure to sunlight decreases your risk of cancer, and using sunscreen increases your risk of cancer.

As we have already discussed, sunlight is a major source of vitamin D. Insufficient levels of vitamin D can result in osteoporosis, autoimmune diseases and rheumatoid arthritis – among other equally unpleasant and life-threatening conditions. When you put on those high-SPF sunscreens, not only are you increasing your risk for melanoma, you are increasing your risk of developing all of the conditions that can arise from vitamin D deficiency because you are blocking your body’s ability to synthesize vitamin D.

And while it is possible to obtain vitamin D from food, it is only present in large amounts in certain kinds of seafood – which many people do not consume regularly. The highest sources for vitamin D in food are anglerfish liver, cow’s blood (I’m not joking) and high-vitamin cod liver oil (HVCLO). It is also present in more modest amounts in chum salmon, Pacific marlin, herring, bluefin tuna, duck eggs, trout, eel, mackerel and salmon.

I’m going to go out on a limb and guess that most Americans aren’t eating these foods on a regular basis. The lack of adequate intake of vitamin D in the diet, combined with habitual use of high-SPF sunscreen and/or lack of exposure to the sun is a perfect recipe for increasing the risk of cancer for children and adults alike.

But you will not hear the sunscreen manufacturers telling you to stop using their product, and you probably won’t hear it from dermatologists in the field who have a reputation (and a history of telling people to wear sunscreen) to protect. They’ll tell you that sunburn is an important factor in melanoma formation since that’s really all they have left in terms of support for selling sunscreen. What they neglect to mention is that 1) millions of people get sunburned every year but very few develop melanoma, and more importantly, 2) if melanoma does appear, it’s most likely to appear in areas not exposed to the sun.

Nevertheless, it’s still probably a good idea to avoid getting sunburned – especially on a regular basis. But it is not a good idea to wear sunscreen, nor is it a good idea to avoid sun exposure.

The idea that sunlight causes cancer and sunscreen prevents it is another mainstream myth that has no support in the scientific literature. Just like the idea that cholesterol causes heart disease, eating fat makes you fat, and fluoride is good for your teeth. (If you still believe any of those statements, check my archives and sign up for my free email digest!)

Before closing, I must mention (briefly) the issue of vitamin D toxicity. Vitamin D is widely considered to be the most toxic of all vitamins, and dire warnings are often issued to avoid excess sun exposure and vitamin D in the diet on that basis. The discussion of vitamin D toxicity has failed to take into account the interaction between vitamins A, D and K. Several lines of evidence suggest that vitamin D toxicity actually results from a relative deficiency of vitamins A and K.

So, the solution is not to avoid sun exposure or sources of vitamin D in the diet. Rather, it is to ensure adequate vitamin D intake (through sunlight and food) and to increase the intake (through diet and/or supplements) of vitamins A & K. Stay tuned for a future post on the interaction between vitamins A, D & K and their relevance to human health.

In the meantime, this is what I recommend for protecting against cancer and both deficiency and toxicity of vitamin D:

THS recommendations:

  • Throw away your sunscreen. It contributes to cancer.
  • Get an hour or two of exposure to sunlight each day if possible. Don’t cover your skin (or your child’s skin) completely when out in the sun.
  • Avoid frequent sunburn.
  • In northern latitudes or during winter months when the sun isn’t shining, take 1 tsp./day of high-vitamin cod liver oil (Green Pasture or Radiant Life are two brands I recommend) to ensure adequate vitamin A & D intake. You can also eat vitamin D-rich foods such as herring, duck eggs, bluefin tuna, trout, eel, mackerel, sardines, chicken eggs, beef liver and pork.
  • Make sure to eat enough vitamin K. Primary sources in the diet are natto, hard and soft cheeses, egg yolks, sauerkraut, butter and other fermented foods. Make sure to choose dairy products from grass-fed animals if possible.

As always, leave a comment or contact me with questions!

Most health-conscious folks have heard of essential fatty acids (EFAs) by now. It isn’t unusual for a health food store to sell several different brands of fish oils, flax oil and other blends of “essential fatty acids”. We’ve been told that consuming these oils will keep us healthy and protect us from disease.

Today’s nutrition textbooks refer to omega-6 (linoleic) acid and omega-3 (alpha-linolenic) acid as essential components of the human diet, and cite the requirement as something between one and four percent of total caloric intake. When scientists say a nutrient is “essential”, they mean it cannot be synthesized within our bodies from other components by any known mechanism – and therefore must be obtained from the diet.

But are “essential fatty acids” truly essential?

Chris Masterjohn, a PhD candidate in Nutritional Science at the University of Connecticut, has just published a paper which directly challenges the belief that omega-6 linoleic acid and omega-3 alpha-linolenic acid are essential.

His review of the scientific research suggests that omega-6 arachidonic acid (AA) and the omega-3 docosahexaenoic acid (DHA) are the only fatty acids that are truly essential – and thus necessary in the diet – for humans. Further, the true requirement for EFA during growth and development (during childhood, pregnancy or recovery from injury and illness) is less than one-half of one percent of calories when supplied by most animal fats, and even less (0.12 percent) when supplied by liver. In healthy adults, the requirement is “infinitesimal if it exists at all.”

So why is this a concern? Excess consumption of linoleate (omega-6 fatty acid) from vegetable oil will interfere with the production of DHA, while an excess of EPA from fish oil will interfere with the production and utilization of AA. So, by consuming an abundance of the oils which are today heavily promoted as “essential” – vegetable oil and fish oil – we are actually reducing the amount of the fatty acids that are truly essential: DHA & AA.

Finally, it must be pointed out that EFAs of all types, even the health promoting DHA & AA, are polyunsaturated fatty acids (PUFAs). PUFAs are widely known to contribute to oxidative stress, and oxidative stress directly contributes to many diseases including cancer and heart disease. This is why it is important to restrict our intake of EFAs to as close to the minimum requirement as possible.

Most people are far above this requirement, since vegetable oil is pervasive in the American diet. It’s in just about all processed foods (even the “healthy” ones), fried foods and everything cooked in a restaurant. And many people cook with it at home, without knowing the dangers.

The best sources of EFA in the diet are liver, egg yolk and butter from grass-fed animals. Obtaining these foods from pasture-raised animals is important, as they contain significantly higher concentrations of DHA and AA (the truly essential EFAs) and fat-soluble vitamins than their commercial feedlot counterparts.

THS recommendations:

  • Gradually replace all vegetable oils in your diet with healthy traditional fats (which are protected from oxidative stress) such as butter, virgin (unrefined) coconut oil, palm oil, lard and beef tallow.
  • Eliminate (or at least dramatically reduce) consumption of processed and fried foods.
  • Do not take flax oil or fish oil supplements on a regular basis. Cod liver oil is recommended during pregnancy, lactation and childhood to provide extra DHA and to obtain fat-soluble vitamins.

Following these recommendations, along with a nutrient-dense, whole foods based diet low in sugar and rich in essential minerals, should reduce your intake of PUFA to closer to the recommended 0.5 (one-half of one) percent of calories, and ensure adequate intake of the truly essential DHA & AA.
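To put that one-half of one percent figure in concrete terms, here is a minimal sketch of converting a percent-of-calories target into grams per day. The 2,000 kcal/day diet is my own illustrative assumption, not a figure from the article; fat supplies roughly 9 kcal per gram.

```python
# Convert a percent-of-calories PUFA target into grams of fat per day.
# Assumes fat supplies ~9 kcal per gram (standard nutrition figure).

def pufa_grams(total_kcal: float, pct_of_calories: float, kcal_per_gram: float = 9.0) -> float:
    """Grams of PUFA per day at the given percent of total calories."""
    return total_kcal * (pct_of_calories / 100.0) / kcal_per_gram

# For an assumed 2,000 kcal/day diet at the recommended 0.5 percent of calories:
print(round(pufa_grams(2000, 0.5), 1))  # about 1.1 grams per day
```

In other words, at that target the entire day’s PUFA allowance amounts to roughly a quarter teaspoon of oil, which is why the pervasive use of vegetable oil pushes most people far past it.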

Women who are pregnant or lactating (or attempting to become pregnant), children, and adults recovering from injury or suffering from chronic, degenerative disease can safely consume up to one percent of calories as PUFA. Studies have suggested that a subset of patients with pre-existing cardiovascular disease also benefit from a moderate dose of fish oil (up to one gram per day); however, in those same studies, fish oil actually increased the risk of heart attack in people with stable angina and in people with no heart disease at all.

Check back here for a future post on what the research has to say about using omega-3 fatty acids (fish oil) in the treatment of heart disease.

Make sure to visit Chris Masterjohn’s website, where you can purchase the excellent full report for $15. It’s a worthwhile investment, in my opinion, if you want to get the straight scoop about EFAs and their role in our diet.

(Excerpted from the Weston A. Price foundation Journal: Caustic Commentary – Summer 2007)

Full-fat milk has pretty much disappeared from the public schools—not just in the US, but also in New Zealand, Australia and the UK. In most schools, children have a choice of watery reduced-fat milk or sugar-laden chocolate milk, based on the misconception that the butterfat in whole milk will cause heart disease later in life. So it’s a bit embarrassing when a study comes along showing that whole-fat milk products may help women conceive. Over a period of eight years, Jorge E Chavarro of the Harvard School of Public Health in Boston assessed the diets of 18,555 married women without a history of infertility who attempted to get pregnant or became pregnant. During the study, 2165 women were examined medically for infertility and 438 were found to be infertile due to lack of ovulation. The researchers found that women who ate two or more servings of lowfat dairy foods per day, particularly skim milk and yogurt, increased their risk of ovulation-related infertility by more than 85 percent compared with women who ate less than one serving of lowfat dairy food per week (Human Reproduction, online February 28, 2007). Chavarro advises women wanting to conceive to consume high-fat dairy foods like whole milk and ice cream, “while at the same time maintaining their normal calorie intake and limiting their overall intake of saturated fats in order to maintain good general health.” Once a woman becomes pregnant, says Chavarro, “she should probably switch back to lowfat dairy foods.” No one has looked at the effect on fertility of lowfat dairy for the developing fetus and for growing school children. Odds are that infertility due to life-long fat starvation will not be so easily reversed by a temporary return to high-fat dairy foods.

THS COMMENTARY:

This is a perfect example of how mainstream dogma gets in the way of clear thinking. The study unambiguously showed the superiority of whole-fat milk products for helping women become pregnant. Yet the author of the study advises women to “switch back to lowfat dairy foods” once they become pregnant! So, according to this twisted logic, the nutrients in whole-fat milk that helped a woman conceive in the first place will somehow suddenly become harmful to her and her fetus during pregnancy? Isn’t it far more reasonable to assume that the same nutrients that increased women’s fertility will also support the growth and development of the fetus? In fact, there is plenty of research that supports this common-sense view (stay tuned for a future post on this).

And Fatter?

Will lowfat milk served in schools not only make our children infertile, but also fatter? That’s the conclusion from a 2006 Swedish study which looked at 230 families in Goteborg, Sweden. Almost all of the children were breastfed until five months and 85 percent had parents who were university educated. Seventeen percent were classified as overweight, and a higher body mass index (BMI) was associated with a lower fat intake—and those on lower fat diets consumed more sugar. A lower fat intake was also associated with high insulin resistance (www.ub.gu.se/sok/dissdatabas/detaljvy.xml?id=6979).

Whole Fat Milk, Lower Weight Gain

In yet another defeat for the lowfat, you-must-suffer-to-lose-weight school of thought, a Swedish study has found that women who regularly consume at least one serving of full-fat dairy every day gained about 30 percent less weight than women who didn’t. The researchers, from the Karolinska Institute in Stockholm, looked at the intake of whole, sour, medium- and lowfat milk, as well as cheese and butter for 19,352 Swedish women aged 40-55 years at the start of the study. The researchers report that a regular and constant intake of whole milk, sour milk and cheese was significantly and inversely associated with weight gain (that is, those consuming whole-milk products did not gain weight), while the other intake groups were not. A constant intake of at least one daily serving of whole and sour milk was associated with 15 percent less weight gain, while cheese was associated with 30 percent less weight gain (American Journal of Clinical Nutrition, 2007;84(6):1481-1488). This wonderful scientific news has not inspired WebMD to remove their guidelines to eating “fabulous foreign foods.” The trick, they say, is to avoid dishes made with coconut milk in Thai restaurants; ghee, beef and lamb in Indian restaurants; and cream soups, cream sauces, béarnaise, creamy dressings, pâté, fatty meats, duck and sausages in French restaurants (onhealth.webmd.com). In other words, enjoy your meal out but not too much.

Conventional dietary wisdom holds that the micronutrients (vitamins, minerals and trace elements) we need from foods are most highly concentrated in fruits and vegetables. While it’s true that fresh fruits and veggies are full of vitamins and minerals, their micronutrient content pales in comparison to what is found in meats and organ meats – especially liver.

The chart below lists the micronutrient content of apples, carrots, red meat and beef liver. Note that red meat surpasses apples and carrots in every nutrient except vitamin C and folic acid, and that beef liver exceeds both plant foods in every nutrient listed—including vitamin C—often by a wide margin. In general, organ meats are between 10 and 100 times higher in nutrients than the corresponding muscle meats.

In fact, you might be surprised to learn that in some traditional cultures, only the organ meats were consumed. The lean muscle meats, which are what we mostly eat in the U.S. today, were discarded or perhaps given to the dogs.

A popular objection to eating liver is the belief that the liver is a storage organ for the body’s toxins. While it is true that one of the liver’s roles is to neutralize toxins (such as drugs, chemical agents and poisons), it does not store them. Toxins the body cannot eliminate are more likely to accumulate in the body’s fatty tissues and nervous system. On the other hand, the liver is a storage organ for many important nutrients (vitamins A, D, E, K, B12 and folic acid, and minerals such as copper and iron). These nutrients provide the body with some of the tools it needs to get rid of toxins.

Remember that it is essential to eat meat and organ meats from animals that have been raised on fresh pasture without hormones, antibiotics or commercial feed. Pasture-raised animal products are much higher in nutrients than animal products that come from commercial feedlots. For example, meat from pasture-raised animals has 2-4 times more omega-3 fatty acids than meat from commercially-raised animals. And pasture-raised eggs have been shown to contain up to 19 times more omega-3 fatty acids than supermarket eggs! In addition to these nutritional advantages, pasture-raised animal products benefit farmers, local communities and the environment.

For more information on the incredible nutritional benefits of liver and some suggestions for how to prepare it, click here.

| Nutrient | Apple (100 g) | Carrots (100 g) | Red Meat (100 g) | Beef Liver (100 g) |
|---|---|---|---|---|
| Calcium | 3.0 mg | 3.3 mg | 11.0 mg | 11.0 mg |
| Phosphorus | 6.0 mg | 31.0 mg | 140.0 mg | 476.0 mg |
| Magnesium | 4.8 mg | 6.2 mg | 15.0 mg | 18.0 mg |
| Potassium | 139.0 mg | 222.0 mg | 370.0 mg | 380.0 mg |
| Iron | .1 mg | .6 mg | 3.3 mg | 8.8 mg |
| Zinc | .05 mg | .3 mg | 4.4 mg | 4.0 mg |
| Copper | .04 mg | .08 mg | .18 mg | 12.0 mg |
| Vitamin A | None | None | 40 IU | 53,400 IU |
| Vitamin D | None | None | Trace | 19 IU |
| Vitamin E | .37 mg | .11 mg | 1.7 mg | .63 mg |
| Vitamin C | 7.0 mg | 6.0 mg | None | 27.0 mg |
| Thiamin | .03 mg | .05 mg | .05 mg | .26 mg |
| Riboflavin | .02 mg | .05 mg | .20 mg | 4.19 mg |
| Niacin | .10 mg | .60 mg | 4.0 mg | 16.5 mg |
| Pantothenic Acid | .11 mg | .19 mg | .42 mg | 8.8 mg |
| Vitamin B6 | .03 mg | .10 mg | .07 mg | .73 mg |
| Folic Acid | 8.0 mcg | 24.0 mcg | 4.0 mcg | 145.0 mcg |
| Biotin | None | .42 mcg | 2.08 mcg | 96.0 mcg |
| Vitamin B12 | None | None | 1.84 mcg | 111.3 mcg |

(Excerpted from the Weston A. Price Journal – “Caustic Commentary”, Fall 2004)

The Top Fourteen

According to government and media health pundits, the top 14 foods are:

  1. Beans
  2. Blueberries
  3. Broccoli
  4. Oats
  5. Oranges
  6. Pumpkin
  7. Salmon
  8. Soy
  9. Spinach
  10. Tea (green or black)
  11. Tomatoes
  12. Turkey
  13. Walnuts
  14. Yoghurt

This uninspiring list reflects the current establishment angels (anti-oxidants and omega-3 fatty acids) and demons (saturated fats and animal foods).

Our list of the top 14 foods, foods that supply vital nutrients including the fat-soluble vitamins, looks like this:

  1. Butter from grass-fed cows (preferably raw)
  2. Oysters
  3. Liver from grass-fed animals
  4. Eggs from grass-fed hens
  5. Cod liver oil
  6. Fish eggs
  7. Whole raw milk from grass-fed cows
  8. Bone broth
  9. Wild salmon
  10. Whole yoghurt or kefir
  11. Beef from grass-fed steers
  12. Sauerkraut
  13. Organic Beets

EDIT: If you noticed there are only 13 foods on the list, that’s because I recently removed shrimp due to increasing mercury levels. Thanks to one of my readers for pointing this out.

A diet containing only these foods will confer lifelong good health; a diet containing only the foods in the first list is the fast track to nutritional deficiencies.

A recent article reported on the results of a trial of the cholesterol-lowering drug Vytorin, a combination of Zocor and Zetia made by Merck and Schering-Plough.

Zocor and Zetia lower cholesterol by different mechanisms, so the idea was that combining them into a single drug (Vytorin) would dramatically lower cholesterol and, it was assumed, reduce heart disease.

They got the first part right. Vytorin did indeed lead to dramatic reductions in cholesterol levels in those who took it. However, it also increased the risk of heart disease, exactly the opposite of the result they were hoping for.

The worst part is that Merck and Schering-Plough sat on this data for almost two years while over five million people around the world continued to take a drug shown to nearly double the risk of heart disease. Congress has launched a full-scale investigation, and the NY Times is publicly demanding a new law to prevent this from happening again.

Yesterday the Times published another article with an update on the investigation, including emails from the lead investigator of the Vytorin trial indicating that Merck and Schering-Plough deliberately delayed publication of the trial’s results.

Yet another case of gross malfeasance by the pharmaceutical industry. Consumers beware.

Related articles

  • Accusations of Delays in Releasing Drug Results
  • Doubt Cast on Two Drugs Used to Lower Cholesterol
  • Editorial: Overpromoted Cholesterol Drugs
