Cancer


You might have seen an article in your newspaper or online touting a recent study published in the Archives of Internal Medicine that “strongly” linked red meat consumption with cancer and an increased risk of death. Heck, how could you miss it? Google shows 547 new articles about the study, and it was mentioned in just about every major newspaper in the U.S.

(That’s not an accident, by the way. It’s an intentional attack by the tyrannical meat-hating scientific majority, the same folks who brought us the “cholesterol causes heart disease” and “saturated fat is bad for you” myths.)

Trouble is - as is so often the case - the study is deeply flawed. In fact, anyone with training in research methodology might find themselves wondering “where’s the beef?” after they read it. In the end it’s just another piece of worthless propaganda parading as medical research. It tells us a lot more about the biases and motives of the researchers, and the incompetence of the media reporting on it, than it does about the effect of red meat consumption on human health.

Here are my “top 10” reasons to ignore this study and continue to eat your grass-fed, organic red meat:

  1. It was an observational study. Observational studies can show an association between two variables (e.g., red meat consumption and death), but they can never show causation (i.e., that eating red meat caused the deaths). A simple example of the difference between correlation and causation is that elevated white blood cell count is correlated with infections. But that doesn’t mean elevated white blood cell counts cause infections!
  2. The relative risk (RR) was only slightly over 1.0. Most researchers don’t pay attention to an RR under 2.0, due to the notorious difficulties involved with this type of research.
  3. Two articles were published in the American Journal of Clinical Nutrition at around the same time that directly contradicted these results. The first study pooled data from 13 studies and found that risk of colorectal cancer was not associated with saturated fat or red meat intake. The second study found that there was no difference in mortality between vegetarians and meat eaters.
  4. The authors didn’t adequately control for other dietary factors known to increase morbidity and mortality. As another commentator pointed out in her analysis of this study, “Americans get their “cancer causing” red meat served to them on a great big white bun with a load of other carbohydrates (soda, chips, fries) and inflammation-causing n-6 vegetable oils (chips, fries, salad dressings) on the side.” It’s more likely (based on other studies, including the two mentioned above) that the increase in deaths was caused by the junk food surrounding the red meat and not by the meat itself.
  5. The basis of measurement is a “detailed questionnaire.” Questionnaires about one’s diet are always error-prone, as remarkably few people remember accurately what they eat on any given day, let alone over a period of years. Furthermore, most people lie about what they actually eat, especially now that proper diet has been given a quasi-religious significance and eating poorly is equated with being morally inferior.
  6. Check out this quote from the Archives of Internal Medicine study:

    “Red meat intake was calculated using the frequency of consumption and portion size information of all types of beef and pork and included bacon, beef, cold cuts, ham, hamburger, hotdogs, liver, pork, sausage, steak, and meats in foods such as pizza, chili, lasagna, and stew”.

    In other words, even those people who ate things like hot dogs and hamburgers (with buns made of refined white flour), and who ate pizza (on refined white flour crusts) were included in the ‘red meat’ group. Also, those who ate processed or cured meats, such as ham, bacon, sausage, hot dogs, or cold cuts (with possible nitrates) were included in the ‘red meat’ group. And those who ate prepared food (with unknown additives and preservatives) such as pizza, chili, lasagna, and stew were also included in the ‘red meat’ group. Therefore, this study does absolutely nothing to prove that red meat, and not these processed and highly refined foods, is the culprit.

  7. The quality of the meat consumed in the study was not taken into account. Highly processed and adulterated “factory-farmed” meats like salami and hot dogs are lumped together with grass-fed, organic meat as if they’re the same thing. It’s likely that very little of the meat people ate in the study was from pasture-fed animals. Factory fed animals are fed corn (high in polyunsaturated, omega-6 fat), antibiotics, and hormones, all of which negatively impact human health.
  8. We don’t know anything about the lifestyles of the different study groups. Were they under stress? Did they lose their jobs? Did they have other illnesses? Did they live in a toxic environment? All of these factors contribute significantly to disease and mortality.
  9. We don’t know if the people in the study ate more sugar, processed food, artificial sweeteners, preservatives, additives or fast food - all of which are known to cause health problems.
  10. We don’t know if the people who ate more red meat were better off financially than the people who ate less red meat, and thus had more exposure to the “medical industrial complex” - which, as you know from my previous article, kills more than 225,000 people per year and is the 3rd leading cause of death in this country.
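To make point #2 concrete, here is a minimal sketch of how a relative risk is computed from two groups, and why a value only slightly above 1.0 is weak evidence in observational research. The counts below are made up for illustration; they are not the study's actual data.

```python
# Illustrative sketch with hypothetical counts -- NOT the study's actual data.

def relative_risk(exposed_cases, exposed_total, unexposed_cases, unexposed_total):
    """Relative risk: incidence in the exposed group divided by
    incidence in the unexposed group."""
    return (exposed_cases / exposed_total) / (unexposed_cases / unexposed_total)

# Suppose 12 deaths per 1,000 heavy red-meat eaters vs. 10 per 1,000 light eaters:
rr = relative_risk(12, 1000, 10, 1000)
print(round(rr, 2))  # 1.2 -- a "20% higher risk" headline, yet well below the
                     # RR of 2.0 threshold researchers use before taking an
                     # observational association seriously.
```

An RR of exactly 1.0 would mean no difference between the groups at all, which is why a value barely above it is so easy to produce by confounding alone.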

I could go on, but I think you get the idea. Nothing to see here, folks. Move along.

Me? I’m gonna go have a big, juicy, grass-fed steak.

Further recommended reading

  1. Meat and Mortality. A great critique of the study by Dr. Michael Eades, author of Protein Power.
  2. More on Meat & Sustainability. A Challenge to Environmentalists.
  3. The Red Scare. Another insightful analysis over at Mark’s Daily Apple.

In the last two weeks alone three articles have appeared in the scientific press about new studies reporting on vitamin D’s many crucial roles in the body. Along with promoting strong bones, a healthy immune system and protection against some types of cancer, recent studies suggest vitamin D can treat heart failure, protect against heart attacks and reduce the risk of death from both cardiovascular and overall causes.

Back in April I wrote an article called “Throw Away Your Sunscreen” about the protective effects of exposure to sunlight against melanoma. Despite conventional wisdom that tells us to avoid sun exposure at all costs, it turns out that the vitamin D our bodies synthesize when exposed to UV light is a first line of defense against developing melanoma.

In an article published on June 9 in Archives of Internal Medicine, scientists reported that low levels of vitamin D are associated with a higher risk of myocardial infarction (heart attack) in men. The study showed that rates of cardiovascular disease-related deaths are increased at higher latitudes and during the winter months, and are lower at lower latitudes.

In an article published on June 12, appearing in the July issue of the Journal of Cardiovascular Pharmacology, researchers found that vitamin D directly contributes to cardiovascular fitness. In fact, University of Michigan pharmacologist Robert U. Simpson, Ph.D., thinks it’s apt to call vitamin D “the heart tranquilizer”. Simpson and his team discovered that treatment with activated vitamin D prevented hypertrophy of heart muscle cells, the enlargement and overworking of the heart seen in people with heart failure.

Finally, in a study published on June 23 in the Archives of Internal Medicine, a team of Austrian scientists revealed that people with low blood levels of vitamin D appear to have an increased risk of death overall and from cardiovascular causes. Harald Dobnig, M.D., and his colleagues studied 25-hydroxyvitamin D and 1,25-dihydroxyvitamin D levels in 3,258 consecutive patients (average age 62 years) who were scheduled for coronary angiography testing at a single medical center between 1997 and 2000.

During 7.7 years of follow-up, death rates from any cause and from cardiovascular causes were higher among individuals in the lower one-half of 25-hydroxyvitamin D levels and the lowest one-fourth of 1,25-dihydroxyvitamin D levels. These associations remained when researchers controlled for other factors such as coronary artery disease, physical activity and co-occurring diseases.

So what does all this mean to you? A recent consensus panel estimated that about 50 - 60 percent of older individuals in North America and the rest of the world do not have satisfactory vitamin D status, and the situation is similar for younger individuals. Blood levels of vitamin D lower than 20 to 30 nanograms per milliliter have been associated with falls, fractures, cancer, autoimmune dysfunction, cardiovascular disease and hypertension.

To put it bluntly, that means half of all people around the world are deficient in vitamin D and therefore at increased risk for serious and potentially fatal conditions.

Low 25-hydroxyvitamin D levels are also correlated with markers of inflammation such as C-reactive protein, as well as signs of oxidative damage to cells, Dobnig’s study revealed. In a previous article, I explained that inflammation and oxidative damage (not cholesterol) are the primary causes of the worldwide heart disease epidemic. Inflammation and oxidative damage are also contributing factors to diabetes, metabolic syndrome, cancer and many other diseases.

So how does vitamin D work its magic? It acts as a potent hormone in more than a dozen types of tissues and cells in the body, regulating expression of essential genes and rapidly activating already expressed enzymes and proteins. In the heart, vitamin D binds to specific vitamin D receptors and produces its “calming”, protective effects.

There are essentially three ways to obtain vitamin D: exposure to UV light, food and supplements. The most effective of all of these methods is exposure to sunlight. Full-body exposure of pale skin to summer sunshine for 30 minutes without clothing or sunscreen can result in the synthesis of between 10,000 and 20,000 IU of vitamin D. At most latitudes outside of the tropics, however, there are substantial portions of the year during which vitamin D cannot be obtained from sunlight; additionally, environmental factors including pollution and the presence of buildings can reduce the availability of UVB light.

In northern latitudes or during winter months when the sun isn’t shining, I recommend taking 1 tsp./day of high-vitamin cod liver oil (Green Pasture or Radiant Life are two brands I recommend) to ensure adequate vitamin D (and vitamin A) intake. You can also eat vitamin D-rich foods such as herring, duck eggs, bluefin tuna, trout, eel, mackerel, sardines, chicken eggs, beef liver and pork. If you follow this approach further supplementation should not be necessary.

Before closing, I must mention (briefly) the issue of vitamin D toxicity. Vitamin D is widely considered to be the most toxic of all vitamins, and dire warnings are often issued to avoid excess sun exposure and vitamin D in the diet on that basis. The discussion of vitamin D toxicity has failed to take into account the interaction between vitamins A, D and K. Several lines of evidence suggest that vitamin D toxicity actually results from a relative deficiency of vitamins A and K.
So, the solution is not to avoid sun exposure or sources of vitamin D in the diet. Rather, it is to ensure adequate vitamin D intake (through sunlight and food) and to increase the intake (through diet and/or supplements) of vitamins A & K. Stay tuned for a future post on the interaction between vitamins A, D & K and their relevance to human health.

THS recommendations:

  • Throw away your sunscreen. Use coconut and sesame oil if needed, and moderate your exposure to sun to avoid frequent sunburn.
  • Get an hour or two of exposure to sunlight each day if possible. Don’t cover your skin (or your child’s skin) completely when out in the sun.
  • In northern latitudes or during winter months when the sun isn’t shining, take 1 tsp./day of high-vitamin cod liver oil (Green Pasture or Radiant Life are two brands I recommend) to ensure adequate vitamin A & D intake.
  • Eat vitamin D-rich foods such as herring, duck eggs, bluefin tuna, trout, eel, mackerel, sardines, chicken eggs, beef liver and pork.
  • Make sure to eat enough vitamin K. Primary sources in the diet are natto, hard and soft cheeses, egg yolks, sauerkraut, butter and other fermented foods. Make sure to choose dairy products from grass-fed animals if possible.

Suggested Links

  • The Vitamin D Miracle: Is it For Real?
  • From Seafood to Sunshine: A New Understanding of Vitamin D Safety
  • Vitamin D Toxicity Redefined

A study recently published by the European Prospective Investigation into Cancer and Nutrition (EPIC) has revealed that increased intake of vitamin K2 may reduce the risk of prostate cancer by 35 percent. The authors point out that the benefits of K2 were most pronounced for advanced prostate cancer, and, importantly, that vitamin K1 did not offer any prostate benefits.

The findings were based on data from more than 11,000 men taking part in the EPIC Heidelberg cohort. The study adds to a small but fast-growing body of science supporting the potential health benefits of vitamin K2 for bone, cardiovascular, skin, brain, and now prostate health.

Unfortunately, many people are not aware of the health benefits of vitamin K2. The K vitamins have been underrated and misunderstood up until very recently in both the scientific community and the general public.

It has been commonly believed that the benefits of vitamin K are limited to its role in blood clotting. Another popular misconception is that vitamins K1 and K2 are simply different forms of the same vitamin - with the same physiological functions.

New evidence, however, has confirmed that vitamin K2’s role in the body extends far beyond blood clotting to include protecting us from heart disease, ensuring healthy skin, forming strong bones, promoting brain function, supporting growth and development and helping to prevent cancer - to name a few. In fact, vitamin K2 has so many functions not associated with vitamin K1 that many researchers insist that K1 and K2 are best seen as two different vitamins entirely.

A large epidemiological study from the Netherlands illustrates this point well. The researchers collected data on the subjects’ vitamin K intakes between 1990 and 1993, then tracked which subjects developed heart disease, which died from it, and how this related to vitamin K2 intake and arterial calcification. They found that calcification of the arteries was the best predictor of heart disease. Those in the highest third of vitamin K2 intakes were 52 percent less likely to develop severe calcification of the arteries, 41 percent less likely to develop heart disease, and 57 percent less likely to die from it. (Geleijnse et al., 2004, pp. 3100-3105) However, intake of vitamin K1 had no effect on cardiovascular disease outcomes.
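For readers who want to see how a figure like “57 percent less likely to die” maps onto a relative risk, the conversion is simple arithmetic. The sketch below is generic; only the 57 percent figure comes from the Dutch study quoted above, and the functions are illustrative, not the study's own methodology.

```python
# Converting between a relative risk (RR) and a percent risk reduction.
# Generic arithmetic for illustration -- not the study's own analysis.

def risk_reduction_pct(rr):
    """Percent risk reduction implied by a relative risk below 1.0."""
    return (1 - rr) * 100

def rr_from_reduction(pct):
    """Relative risk implied by a given percent risk reduction."""
    return 1 - pct / 100

# "57 percent less likely to die" corresponds to a relative risk of 0.43:
print(round(rr_from_reduction(57), 2))    # 0.43
print(round(risk_reduction_pct(0.43)))    # 57
```

In other words, the highest-K2 group died from heart disease at less than half the rate of the lowest-K2 group.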

While K1 is preferentially used by the liver to activate blood clotting proteins, K2 is preferentially used by other tissues to deposit calcium in appropriate locations, such as in the bones and teeth, and to prevent it from depositing in locations where it does not belong, such as the soft tissues. (Spronk et al., 2003, pp. 531-537) In an acknowledgment of the different roles played by vitamins K1 and K2, the United States Department of Agriculture (USDA) finally determined the vitamin K2 contents of foods in the U.S. diet for the first time in 2006. (Elder, Haytowitz, Howe, Peterson, & Booth, 2006, pp. 436-467)

Another common misconception is that human beings do not need vitamin K2 in their diet, since they have the capacity to convert vitamin K1 to vitamin K2. The amount of vitamin K1 in typical diets is ten times greater than that of vitamin K2, and researchers and physicians have largely dismissed the contribution of K2 to nutritional status as insignificant.

However, although animals can convert vitamin K1 to vitamin K2, a significant amount of evidence suggests that humans require preformed K2 in the diet to obtain and maintain optimal health. The strongest indication that humans require preformed vitamin K2 in the diet is that epidemiological and intervention studies both show its superiority over K1. Intake of K2 is inversely associated with heart disease in humans while intake of K1 is not (Geleijnse et al., 2004, pp. 3100-3105), and vitamin K2 is at least three times more effective than vitamin K1 at activating proteins related to skeletal metabolism. (Schurgers et al., 2007) And remember that in the study on vitamin K2 and prostate cancer risk, which I mentioned at the beginning of this article, vitamin K1 had no effect.

All of this evidence points to the possibility that vitamin K2 may be an essential nutrient in the human diet. So where does one find vitamin K2 in foods? The following is a list of the foods highest in vitamin K2, as measured by the USDA:

Foods high in vitamin K2

  • Natto
  • Hard cheese
  • Soft cheese
  • Egg yolk
  • Butter
  • Chicken liver
  • Salami
  • Chicken breast
  • Ground beef

Unfortunately, precise values for some foods that are likely to be high in K2 (such as organ meats) are not available at this time. The pancreas and salivary glands would be richest; reproductive organs, brains, cartilage and possibly kidneys would also be very rich; finally, bone would be richer than muscle meat. Fish eggs are also likely to be rich in K2.

It was once erroneously believed that intestinal bacteria are a major contributor to vitamin K status. However, the majority of evidence contradicts this view. Most of the vitamin K2 produced in the intestine is embedded within bacterial membranes and is not available for absorption. Thus, intestinal production of K2 likely makes only a small contribution to vitamin K status. (Unden & Bongaerts, 1997, pp. 217-234)

Fermented foods, on the other hand, such as sauerkraut, cheese and natto (a soy dish popular in Japan), contain substantial amounts of vitamin K2. Natto contains the highest concentration of K2 of any food measured; nearly all of it is present as MK-7, which research has shown to be a highly effective form. A recent study demonstrated that MK-7 increased the percentage of osteocalcin in humans three times more powerfully than did vitamin K1. (Schurgers & Vermeer, 2000, pp. 298-307)

It is important to note that commercial butter is not a significantly high source of vitamin K2. Dr. Weston A. Price, who was the first to elucidate the role of vitamin K2 in human health (though he called it “Activator X” at the time), analyzed over 20,000 samples of butter sent to him from various parts of the world. He found that the Activator X concentration varied as much as 50-fold. Animals grazing on vitamin K-rich cereal grasses, especially wheat grass, and alfalfa in a lush green state of growth produced fat with the highest amounts of Activator X, but the soil in which the pasture was grown also influenced the quality of the butter. It was only the vitamin-rich butter from pasture grown in three feet or more of healthy topsoil that had such dramatic curing properties when combined with cod liver oil in Dr. Price’s experiments and clinical practice.

Therefore, vitamin K2 levels will not be high in butter from grain-fed cows raised in confinement feedlots. Since the overwhelming majority of butter sold in the U.S. comes from such feedlots, butter is not a significant source of K2 in the diet for most people. This is yet another argument for obtaining raw butter from cows raised on green pasture.

New research which expands our understanding of the many important roles of vitamin K2 is being published at a rapid pace. Yet it is already clear that vitamin K2 is an important nutrient for human health - and one of the most poorly understood by medical authorities and the general public.

Recommended links

  • On the Trail of the Elusive X-Factor
  • The Vitamin You Need to Prevent Prostate Cancer
  • K2 Associated with Reduced Risk of Coronary Heart Disease

Exposure to sunlight prevents melanoma.

Yes, you did read that correctly.

Two independent studies published in the Feb. 2005 issue of the prestigious Journal of the National Cancer Institute (JNCI) squarely contradict the popular myth that UV light causes melanoma.

The first study evaluated the hypothesis that UV radiation increases your risk of developing lymphoma - a hypothesis that had become widely accepted in the 1990s and early 2000s. After studying nearly 7,000 subjects, the authors concluded that the opposite is actually true: increased sun exposure reduces the risk of non-Hodgkin’s lymphoma (NHL) by up to 40%. What’s more, the reduction in risk was dose-related, which means that the more sun exposure someone got, the lower their risk of cancer was.

The second study looked at the link between sun exposure and the chances of surviving melanoma, which is the deadliest form of skin cancer. Guess what? The researchers concluded that increased sun exposure decreases the chance of dying from skin cancer by approximately 50%.

At this point you might be scratching your head and wondering how this could possibly be true, in light of what we’ve been told all these years about the relationship between sunlight and skin cancer. Let’s take a closer look at what explains this phenomenon, and why you likely haven’t heard about it on the news.

Clarification

An editorial published in the same issue of JNCI begins with this statement:

“Solar radiation is a well-established skin carcinogen, responsible for more cancers worldwide than any other single agent.”

This is true. But what the authors neglect to mention is that the type of cancer they are referring to is not melanoma but other types of cancer. Melanoma is the most serious form of skin cancer because it is malignant and can metastasize (spread) to other areas of the body, often leading to death.

But 90 percent of skin cancers are not melanomas. Rather, the most common forms are basal and squamous cell carcinomas, which rarely spread and are easily cured by simple outpatient surgery. These far less dangerous forms of skin cancer are indeed caused by solar radiation (at least according to current research). Melanomas, however, are most likely caused by lack of sunlight or excess exposure to artificial light!

The editorial mentioned two other very important facts that you aren’t likely to hear about from mainstream media sources: that melanoma is normally found in areas of the body that are not typically exposed to sunlight at all (use your imagination), and that vitamin D may be important in preventing melanoma.

Here’s what they actually had to say:

“Evidence is beginning to emerge that sunlight exposure, particularly as it relates to vitamin D synthesized in the skin under the influence of solar radiation, might have a beneficial influence for certain cancers.”

Umm, like, we already knew that.

The role of Vitamin D

It has been known for several years that sun exposure might have a beneficial effect on certain cancers. A 1999 publication of the National Institutes of Health (NIH) entitled Atlas of Cancer Mortality in the United States revealed that among Caucasians in the United States, cancer mortality for several prominent cancers, including cancer of the breast, prostate and colon, shows a striking latitudinal gradient. Specifically, people living in northern states have much higher rates of these cancers than those residing in the southern states.

The reason for this? Northern states get a whole lot less sunshine than southern states.

As early as 1990 it was proposed that vitamin D, which is synthesized in the skin upon exposure to UV light, might be the agent that accounts for these geographical patterns. (Garland et al. 1990) Less exposure to sunshine means less production of vitamin D. It is known that calcitriol, the active form of vitamin D3, has multiple cellular effects that could confer protection against cancer. The ability to convert the precursor of vitamin D to the active form (calcitriol) is greatly reduced at northern latitudes, and populations living far from the equator are at increased risk of vitamin D deficiency during the winter months. (Tangpricha et al. 2002)

Even more significant may be the observation that patients with malignant melanoma exhibit low levels of vitamin D3 in their blood, and that others have a problem with the receptor for vitamin D. (Hutchinson et al. 2000; Green et al. 1983) The incidence of melanoma of the skin on sites of the body intermittently exposed to sunlight is reduced among outdoor workers compared with indoor workers. (Elwood et al. 1985)

All of this points to a protective role for vitamin D against cancer in general, and melanoma in particular. But the final nail in the coffin of the “sunlight causes melanoma” hypothesis is this:

A comprehensive review of research studies from 1966 through 2003 failed to show any association between melanoma and sunscreen use! (Dennis et al. 2003)

Say what? Sunscreen doesn’t prevent skin cancer, that’s what.

Does sunscreen contribute to skin cancer?

One thing sunlight does cause is an injury to the inner layer of the skin (called the “dermis”), which leads to a wrinkling of the outer layer (called the “epidermis”). This phenomenon, which happens naturally with age but is accelerated by sun exposure, is called “solar elastosis”, or SE.

Sounds like a bad thing, right? But when researchers at the University of New Mexico studied melanoma, they found a marked decrease in the disease in patients with SE. (Berwick et al. 2005) To put it simply: more sun exposure equals lower risk of melanoma. For patients who already had melanoma, the subsequent death rate from the disease was approximately one-half as high in the group of patients with signs of SE.

I’ll give you a minute to finish cursing the “medical authorities” that have been admonishing us to slather ourselves and our children with sunscreen for decades in order to “prevent skin cancer”. As it turns out, those of us who followed this advice (and why wouldn’t we have? It sounded logical…) may actually have increased our chances, and our children’s chances, of developing not just skin cancer, but other cancers as well.

I’m sorry to scare you like that, but I feel I must in order to make this point as clearly as I can:

Exposure to sunlight decreases your risk of cancer, and using sunscreen increases your risk of cancer.

As we have already discussed, sunlight is a major source of vitamin D. Insufficient levels of vitamin D can result in osteoporosis, autoimmune diseases and rheumatoid arthritis - among other equally unpleasant and life-threatening conditions. When you put on those high-SPF sunscreens, not only are you increasing your risk for melanoma, you are increasing your risk of developing all of the conditions that can arise from vitamin D deficiency because you are blocking your body’s ability to synthesize vitamin D.

And while it is possible to obtain vitamin D from food, it is only present in large amounts in certain kinds of seafood - which many people do not consume regularly. The highest sources for vitamin D in food are anglerfish liver, cow’s blood (I’m not joking) and high-vitamin cod liver oil (HVCLO). It is also present in more modest amounts in chum salmon, Pacific marlin, herring, bluefin tuna, duck eggs, trout, eel, mackerel and salmon.

I’m going to go out on a limb and guess that most Americans aren’t eating these foods on a regular basis. The lack of adequate intake of vitamin D in the diet, combined with habitual use of high-SPF sunscreen and/or lack of exposure to the sun is a perfect recipe for increasing the risk of cancer for children and adults alike.

But you will not hear the sunscreen manufacturers telling you to stop using their product, and you probably won’t hear it from dermatologists in the field who have a reputation (and a history of telling people to wear sunscreen) to protect. They’ll tell you that sunburn is an important factor in melanoma formation since that’s really all they have left in terms of support for selling sunscreen. What they neglect to mention is that 1) millions of people get sunburned every year but very few develop melanoma, and more importantly, 2) if melanoma does appear, it’s most likely to appear in areas not exposed to the sun.

Nevertheless, it’s still probably a good idea to avoid getting sunburned - especially on a regular basis. But it is not a good idea to wear sunscreen, nor is it a good idea to avoid sun exposure.

The idea that sunlight causes cancer and sunscreen prevents it is another mainstream myth that has no support in the scientific literature. Just like the idea that cholesterol causes heart disease, eating fat makes you fat, and fluoride is good for your teeth. (If you still believe any of those statements, check my archives and sign up for my free email digest!)

Before closing, I must mention (briefly) the issue of vitamin D toxicity. Vitamin D is widely considered to be the most toxic of all vitamins, and dire warnings are often issued to avoid excess sun exposure and vitamin D in the diet on that basis. The discussion of vitamin D toxicity has failed to take into account the interaction between vitamins A, D and K. Several lines of evidence suggest that vitamin D toxicity actually results from a relative deficiency of vitamins A and K.
So, the solution is not to avoid sun exposure or sources of vitamin D in the diet. Rather, it is to ensure adequate vitamin D intake (through sunlight and food) and to increase the intake (through diet and/or supplements) of vitamins A & K. Stay tuned for a future post on the interaction between vitamins A, D & K and their relevance to human health.

In the meantime, this is what I recommend for protecting against cancer and both deficiency and toxicity of vitamin D:

THS recommendations:

  • Throw away your sunscreen. It contributes to cancer.
  • Get an hour or two of exposure to sunlight each day if possible. Don’t cover your skin (or your child’s skin) completely when out in the sun.
  • Avoid frequent sunburn
  • In northern latitudes or during winter months when the sun isn’t shining, take 1 tsp./day of high-vitamin cod liver oil (Green Pasture or Radiant Life are two brands I recommend) to ensure adequate vitamin A & D intake. You can also eat vitamin D-rich foods such as herring, duck eggs, bluefin tuna, trout, eel, mackerel, sardines, chicken eggs, beef liver and pork.
  • Make sure to eat enough vitamin K. Primary sources in the diet are natto, hard and soft cheeses, egg yolks, sauerkraut, butter and other fermented foods. Make sure to choose dairy products from grass-fed animals if possible.

As always, leave a comment or contact me with questions!
