Can You Improve Your Healthspan?

Can You Live Healthier, Longer?

Ever since Ponce de Leon led an expedition to the Florida coast in 1513, we have been searching for the mythical “Fountain Of Youth”. What does that myth mean?

Supposedly, just by immersing yourself in that fountain you would be made younger. You would experience all the exuberance and health you enjoyed when you were young. There have been many snake oil remedies over the years that have promised that. They were all frauds.

But what if you had it in your power to live longer and to retain your youthful health for most of those extra years? The ability to live healthier, longer is something scientists call “healthspan”. But you can think of it as your personal “Fountain Of Youth”.

Where are we as a nation? Americans rank 53rd in the world for life expectancy. We have the life expectancy of a third-world country. We are in sore need of a “Fountain Of Youth”.

That is why I decided to share two recent studies from the prestigious Harvard T.H. Chan School of Public Health with you today.

How Were The Studies Done?

These studies started by combining the data from two major prospective cohort studies:

  • The Nurses’ Health Study, which ran from 1980 to 2014.
  • The Health Professionals Follow-Up Study, which ran from 1986 to 2014.

These two studies enrolled 78,865 women and 42,354 men and followed them for an average of 34 years. During this time there were 42,167 deaths. All the participants were free of heart disease, type 2 diabetes, and cancer at the time they were enrolled. Furthermore, the design of these studies was extraordinary.

  • A detailed food frequency questionnaire was administered every 2-4 years. This allowed the investigators to calculate cumulative averages of all dietary variables.
  • Participants also filled out questionnaires that captured information on disease diagnosis every 2 years with follow-up rates >90%. This allowed the investigators to measure the onset of disease for each participant during the study. More importantly, 34 years is long enough to measure the onset of diseases like heart disease, diabetes, and cancer – diseases that require decades to develop.
  • The questionnaires also captured information on medicines taken and lifestyle characteristics such as body weight, exercise, smoking and alcohol use.
  • For analysis of diet quality, the investigators used something called the “Alternative Healthy Eating Index”. [The original Healthy Eating Index was developed about 10 years ago based on the 2010 “Dietary Guidelines for Americans”. Those guidelines have since been updated, and the “Alternative Healthy Eating Index” is based on the updated guidelines.] You can calculate your own Alternative Healthy Eating Index below, so you can see what is involved.
  • Finally, the investigators included five lifestyle-related factors – diet, smoking, physical activity, alcohol consumption, and BMI (a measure of obesity) – in their estimation of a healthy lifestyle. Based on the best available evidence, they defined “low-risk” in each of these categories. Study participants were assigned 1 point for each low-risk category they achieved. Simply put, if they were low risk in all 5 categories, they received a score of 5. If they were low risk in none of the categories, they received a score of 0.
  • Low risk for each of these categories was defined as follows:
    • Low risk for a healthy diet was defined as those who scored in the top 40% in the Alternative Healthy Eating Index.
    • Low risk for smoking was defined as never smoking.
    • Low risk for physical activity was defined as 30 minutes/day of moderate or vigorous activities.
    • Low risk for alcohol was defined as 0.5-1 drinks/day for women and 0.5-2 drinks/day for men.
    • Low risk for weight was defined as a BMI in the healthy range (18.5-24.9 kg/m2).
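Because the score is simply one point per low-risk category, it is easy to sketch in code. The function below is my own illustrative sketch of the tally described above, not the investigators’ actual analysis code; the AHEI cutoff of 37 points reflects the low-risk diet threshold the investigators used (discussed later in this article).

```python
def lifestyle_score(ahei, never_smoked, activity_min_per_day,
                    drinks_per_day, bmi, is_female):
    """Return 0-5: one point for each low-risk lifestyle category achieved."""
    points = 0
    points += ahei >= 37                      # diet: AHEI at or above the low-risk cutoff
    points += never_smoked                    # smoking: never smoked
    points += activity_min_per_day >= 30      # 30 min/day of moderate or vigorous activity
    upper = 1 if is_female else 2             # alcohol: 0.5-1 (women) or 0.5-2 (men) drinks/day
    points += 0.5 <= drinks_per_day <= upper
    points += 18.5 <= bmi <= 24.9             # BMI in the healthy range
    return points
```

For example, a never-smoker with an AHEI of 40, 45 minutes of daily exercise, one drink a day, and a BMI of 22 would score 5 (a “healthy lifestyle”), while someone who is low risk in no category would score 0.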

Can You Live Healthier Longer?

The investigators compared participants who scored as low risk in all 5 categories with participants who scored as low risk in 0 categories (which would be typical for many Americans). For the purpose of simplicity, I will refer to people who scored as low risk in 5 categories as having a “healthy lifestyle” and those who scored as low risk in 0 categories as having an “unhealthy lifestyle”.

The results of the first study were:

  • Women who had a healthy lifestyle lived 14 years longer than women with an unhealthy lifestyle (estimated life expectancy of 93 versus 79).
  • Men who had a healthy lifestyle lived 12 years longer than men with an unhealthy lifestyle (estimated life expectancy was 87 versus 75).
  • It was not necessary to achieve a perfect lifestyle. Life expectancy increased in a linear fashion for each low-risk lifestyle behavior achieved.

The authors of the study concluded: “Adopting a healthy lifestyle could substantially reduce premature mortality and prolong life expectancy in US adults. Our findings suggest that the gap in life expectancy between the US and other developed countries could be narrowed by improving lifestyle factors.”

The results of the second study were:

  • Women who had a healthy lifestyle lived 11 years longer free of diabetes, heart disease, and cancer than women who had an unhealthy lifestyle (estimated disease-free life expectancy of 85 years versus 74 years).
  • Men who had a healthy lifestyle lived 8 years longer free of diabetes, heart disease, and cancer than men who had an unhealthy lifestyle (estimated disease-free life expectancy of 81 years versus 73 years).
  • Again, disease-free life expectancy increased in a linear fashion for each low-risk lifestyle behavior achieved.

The authors concluded: “Adherence to a healthy lifestyle at mid-life [They started their analysis at age 50] is associated with a longer life expectancy free of major chronic diseases. Our findings suggest that promotion of a healthy lifestyle would help reduce healthcare burdens through lowering the risk of developing multiple chronic diseases, including cancer, cardiovascular disease, and diabetes, and extending disease-free life expectancy.”

Can You Improve Your Healthspan?

I posed the question at the beginning of this article, “Can you improve your healthspan?” These two studies showed that you can improve both your life expectancy and your disease-free life expectancy. So, the answer to the original question appears to be, “Yes, you can improve your healthspan. You can create your personal ‘Fountain of Youth’.”

However, as a nation we appear to be moving in the wrong direction. The percentage of US adults adhering to a healthy lifestyle has decreased from 15% in 1988-1992 to 8% in 2001-2006.

The cohort studies that these two analyses drew their data from were very well designed, so these are strong studies. However, like all scientific studies, they have some weaknesses, namely:

  • They looked at the association of a healthy lifestyle with life expectancy and disease-free life expectancy. Like all association studies, they cannot prove cause and effect.
  • The studies they drew their data from included mostly Caucasian health professionals. The results may differ for other ethnic groups.
  • These studies did not look at the effect of a healthy lifestyle on the onset of Alzheimer’s disease and other forms of dementia. However, other studies have shown that people who were low risk for each of the 5 lifestyle factors (diet, exercise, body weight, smoking, and alcohol use) individually have a reduced risk of developing Alzheimer’s and/or dementia.

Finally, I know you have some questions, and I have answers.

Question: What about supplementation? Will it also improve my healthspan?

Answer: When the investigators analyzed the data, they found that those with the healthiest lifestyles were also more likely to be taking a multivitamin. So, they attempted to statistically eliminate any effect of supplement use on the outcomes. That means these studies cannot answer that question.

However, if you calculate your Alternative Healthy Eating Index below, you will see that most of us fall short of perfection. Supplementation can fill in the gaps.

Question: I cannot imagine myself reaching perfection in all 5 lifestyle categories. Should I even try to achieve low risk in one or two categories?

Answer: The good news is that there was a linear increase in both life expectancy and disease-free life expectancy as people went from low-risk in one category to low-risk in all 5 categories. I would encourage you to try and achieve low risk status in as many categories as possible, but very few of us, including me, achieve perfection in all 5 categories.

Question: I am past 50 already. Is it too late for me to improve my healthspan?

Answer: Diet and some of the other lifestyle behaviors were remarkably constant over 34 years in both the Nurses’ Health Study and the Health Professionals Follow-Up Study. That means that the lifespan and healthspan benefits reported in these studies probably resulted from adhering to a healthy lifestyle for most of the participants’ adult years.

However, it is never too late to start improving your lifestyle. You may not achieve the full benefits described in these studies, but you still can add years and disease-free years to your life.

How To Calculate Your Alternative Healthy Eating Index

You can calculate your own Alternative Healthy Eating Index score by simply adding up the points you score for each food category below.

Vegetables

Count 2 points for each serving you eat per day (up to 5 servings).

One serving = 1 cup green leafy vegetables or ½ cup for all other vegetables.

Do not count white potatoes or processed vegetables like French fries or kale chips.

Fruits

Count 2½ points for each serving you eat per day (up to 4 servings).

One serving = 1 piece of fruit or ½ cup of berries.

(Do not count fruit juice or fruit incorporated into desserts or pastries.)

Whole Grains

Count 2 points for each serving you eat per day (up to 5 servings).

One serving = ½ cup whole-grain rice, bulgur and other whole grains, cereal, and pasta or 1 slice of bread.

(For processed foods like pasta and bread, the label must say 100% whole grain).

Sugary Drinks and Fruit Juice

Count 10 points if you drink 0 servings per week.

Count 5 points for 3-4 servings per week (½ serving per day).

Count 0 points for 7 or more servings per week (≥1 serving per day).

One serving = 8 oz. fruit juice, sugary soda, sweetened tea, coffee drink, energy drink, or sports drink.

Nuts, Seeds and Beans

Count 10 points if you eat 7 or more servings per week (≥1 serving per day).

Count 5 points for 3-4 servings per week (½ serving per day).

Count 0 points for 0 servings per week.

One serving = 1 oz. nuts or seeds, 1 Tbs. peanut butter, ½ cup beans, 3½ oz. tofu.

Red and Processed Meat

Count 10 points if you eat 0 servings per week.

Count 7 points for 3-4 servings per week (½ serving per day).

Count 3 points for 7 servings per week (1 serving per day).

Count 0 points for ≥1½ servings per day.

One serving = 1½ oz. processed meats (bacon, ham, sausage, hot dogs, deli meat) or 4 oz. red meat (steak, hamburger, pork chops, lamb chops, etc.).

Seafood

Count 10 points if you eat 2 or more servings per week.

Count 5 points for 1 serving per week.

Count 0 points for 0 servings per week.

1 serving = 4 oz.

Now that you have your total, the scoring system is:

  • 41 or higher is excellent
  • 37-40 is good
  • 33-36 is average (remember that it is average to be sick in this country)
  • 28-32 is below average
  • Below 28 is poor

Finally, for the purposes of these two studies, a score of 37 or higher was considered low risk.
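If you want to automate the tally above, here is a rough sketch in code. It encodes only the point anchors listed above; the article does not assign points to every intermediate intake level, so the bands between anchors (for example, treating 1-6 sugary drinks per week as the 5-point band, and the red-meat bands noted in the comments) are my own simplifying assumptions.

```python
def ahei_score(veg, fruit, grains, sugary_wk, nuts_wk, meat_wk, fish_wk):
    """Rough Alternative Healthy Eating Index tally.

    veg, fruit, and grains are servings per day; the *_wk arguments are
    servings per week. Bands between the listed anchors are assumptions.
    """
    score = 0
    score += 2 * min(veg, 5)          # vegetables: 2 pts per serving, up to 5
    score += 2.5 * min(fruit, 4)      # fruits: 2.5 pts per serving, up to 4
    score += 2 * min(grains, 5)       # whole grains: 2 pts per serving, up to 5
    if sugary_wk == 0:                # sugary drinks and fruit juice: fewer is better
        score += 10
    elif sugary_wk < 7:
        score += 5
    if nuts_wk >= 7:                  # nuts, seeds, and beans: more is better
        score += 10
    elif nuts_wk >= 3:
        score += 5
    if meat_wk == 0:                  # red and processed meat: fewer is better
        score += 10
    elif meat_wk < 7:                 # roughly the 3-4 servings/week anchor
        score += 7
    elif meat_wk < 10.5:              # up to ~1 serving/day; >=1.5/day scores 0
        score += 3
    if fish_wk >= 2:                  # seafood
        score += 10
    elif fish_wk >= 1:
        score += 5
    return score
```

As a check, maxing out every category yields 70 points, and the low-risk threshold used in these studies was a score of 37 or higher.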

The Bottom Line

Two recent studies have developed a healthy lifestyle score based on diet, exercise, body weight, smoking, and alcohol use. When they compared the effect of lifestyle on both lifespan (life expectancy) and healthspan (disease-free life expectancy), they reported:

  • Women who had a healthy lifestyle lived 14 years longer than women with an unhealthy lifestyle.
  • Men who had a healthy lifestyle lived 12 years longer than men with an unhealthy lifestyle.
  • Women who had a healthy lifestyle lived 11 years longer free of diabetes, heart disease, and cancer than women who had an unhealthy lifestyle.
  • Men who had a healthy lifestyle lived 8 years longer free of diabetes, heart disease, and cancer than men who had an unhealthy lifestyle.
  • It is not necessary to achieve a perfect lifestyle. Lifespan and healthspan increased in a linear fashion for each low-risk lifestyle behavior (diet, exercise, body weight, smoking, and alcohol use) achieved.
  • These studies did not evaluate whether supplement use also affects healthspan.
    • However, if you calculate your diet with the Alternative Healthy Eating Index they use (see above), you will see that most of us fall short of perfection. Supplementation can fill in the gaps.

The authors concluded: “Our findings suggest that promotion of a healthy lifestyle would help reduce healthcare burdens through lowering the risk of developing multiple chronic diseases, including cancer, cardiovascular disease, and diabetes, and extending disease-free life expectancy.”

For more details, including how to calculate whether you are low risk in each of the 5 lifestyle categories, read the article above.

These statements have not been evaluated by the Food and Drug Administration. This information is not intended to diagnose, treat, cure, or prevent any disease.

Which Foods Should I Avoid?

What Is Nutritionism?

Recently, I have been reading Michael Pollan’s book “In Defense of Food”. Yes, I know the book has been around for a long time. Normally I read the scientific literature rather than popular health books. However, in the past few weeks I have had a lot more time to read books, so I decided to read this one.

Some of the things he says are “off the wall”. As he readily admits, he isn’t a scientist or a medical doctor. However, a lot of what he says is “right on”. He echoes many of the things I have been talking about for years. But he does a masterful job of pulling everything together into a framework he calls “nutritionism”.

If you have a chance, I highly recommend that you read his book.

I will briefly summarize his discussion of nutritionism below. I will also share some scientific support for what he is saying. Finally, I will close by sharing what the Bible says on the subject.

What Is Nutritionism?

Simply put, nutritionism is the belief that we can understand food solely in terms of its nutritional and chemical constituents and our requirements for them. I use the term “belief” purposely. As Michael Pollan puts it: “As the ‘-ism’ suggests, nutritionism is not a scientific subject, but an ideology.”

What Michael Pollan is referring to is taking food constituents like saturated fats, cholesterol, sugar, carbohydrates, polyunsaturated fats, monounsaturated fats, fiber, antioxidants, and probiotics and labeling them as either “good” or “bad”.

As he points out, that leads to debacles like the creation of margarine as a substitute for butter. Of course, everyone reading this article knows that we subsequently found out that the trans fat in margarine was worse for us than the saturated fat in butter. He offers many other examples like this.

He also points out that the nutritionism concept has given free rein to the food industry to replace whole foods with processed foods that are cholesterol-free, sugar-free, low-fat, low-carb, or high in fiber, omega-3s, etc. He says that these foods are seldom healthier than the foods they replace. I agree.

Finally, he points out that the scientific support for the classification of individual ingredients or foods as “good” or “bad” is weak. That’s because when scientists design a study that removes a chemical constituent or a food from the diet, they have to replace it with something. And what they replace it with determines the outcome of the study. I give some examples of this in the next section.

The essence of Michael Pollan’s message is:

  • The effect of an individual nutrient or chemical constituent on your health depends on the food it is found in. Forget the fancy nutrition labels. Whole foods are almost always healthier than processed foods.
  • The effect of a food or food constituent on your health also depends on your overall diet. We should be thinking about healthy diets rather than the latest “magical” or “forbidden” food.

I will discuss these points below.

Which Foods Should I Avoid?

Now, let’s get to the question, “Which Foods Should I Avoid?” If we are talking about whole foods, the short answer is “None”. As I said in my book, “Slaying The Food Myths”, “We have 5 food groups for a reason”.

For example, if we are talking about plant foods, each plant food group:

  • Has a unique blend of vitamins and minerals.
  • Has a unique blend of phytonutrients.
  • Has a unique blend of fiber.
  • Supports the growth of a unique combination of beneficial gut bacteria.

Dr. Strangelove and his friends are telling you to eliminate whole grains, fruits, and legumes (beans) from your diet. Recent studies suggest that might not be a good idea. Here is one example.

If we are talking about animal foods, each animal food group:

  • Has a unique blend of vitamins and minerals.
  • May have unique components that are important for our health. [Note: This is an active area of research. Theories have been proposed for which components in animal foods may be important for our health, but they have not been confirmed.]

Vegan purists will tell you that you have no need for meat and dairy foods. Recent studies suggest otherwise. Here is one example.

With that as background, let’s turn our attention to nutritionism and look at some of science behind claims that certain food components are either good for us or bad for us.

Saturated Fat. Saturated fat is the poster child for nutritionism.

First, we were told by the American Heart Association and other health organizations that saturated fat was bad for us. Recently, Dr. Strangelove and his friends have been telling us that saturated fat is good for us. Instead of limiting saturated fat, they say, we should be limiting carbs by cutting out fruits, whole grains, and legumes. Both sides cite clinical studies to support their claims. How can this be?

Perhaps a little history is in order. When the American Heart Association recommended that we decrease intake of saturated fat, they were envisioning that we would replace it with monounsaturated and polyunsaturated fat in the context of a healthy diet of fruits, vegetables, whole grains, and legumes. That never happened.

Big Food quickly realized that if the American public were to follow the AHA guidelines, it would be disastrous for their bottom line. So, they sprang into action. They mixed sugar, white flour, and a witch’s brew of chemicals to create highly processed, low fat “foods”. Then they told the American public, “Don’t worry. You don’t have to give up your favorite foods. We have created low fat alternatives.”

This is the essence of what Michael Pollan refers to as nutritionism. By marketing their fake foods as low fat, Big Food created a halo of health. In fact, Big Food’s fake foods were less healthy than the foods they replaced. Americans got fatter and sicker.

Now let’s look at the conflicting claims that saturated fat is bad for us or good for us. How can clinical studies disagree on such an important question? The answer is simple. It depends on what you replace it with. You need to consider saturated fat intake in the context of the overall diet.

I discussed this in a previous issue of “Health Tips From the Professor”, but let me summarize it briefly here. The American Heart Association tells us that replacing half of the saturated fat in a typical American diet with:

  • Trans fats, increases heart disease risk by 5%.
  • Refined carbohydrates and sugars (the kind of carbohydrates in the typical American Diet), slightly increases heart disease risk.
  • Complex carbohydrates (whole grains, fruits & vegetables), decreases heart disease risk by 9%.
  • Monounsaturated fats (olive oil & peanut oil), decreases heart disease risk by 15%.
  • Polyunsaturated fats (vegetable oils and fish oil), decreases heart disease risk by 25%.
  • Unsaturated fats in the context of a Mediterranean diet, decreases heart disease risk by 45%.

My advice: Saturated fat is neither good for you nor bad for you. A little bit of saturated fat in the context of a healthy diet is fine. A lot of saturated fat in the context of an unhealthy diet is problematic.

Red Meat. Is red meat bad for you? Like saturated fat, it depends on the amount of red meat and the overall diet. I covered this in detail in “Slaying The Food Myths”, but let me summarize briefly here:

According to the World Health Organization, red meat is a probable carcinogen. But if we look at the postulated mechanisms by which it may cause cancer, most of them can be neutralized by components of various plant foods.

My advice: An 8-ounce steak with fries and a soda is probably bad for you. Three ounces of that same steak in a green salad or stir fry may be good for you.

I should make one other point while I am on the topic. Dr. Strangelove and his friends have been telling you that grass-fed beef is better for you than conventionally raised beef. Once again, that is nutritionism. Grass-fed beef is lower in saturated fat and higher in omega-3s than conventionally raised beef. That may be better for your heart, but it has no effect on the cancer-causing potential of red meat. It doesn’t give you license to eat 8-ounce steaks on a regular basis. You still want to aim for 3 ounces of that grass-fed beef in a green salad or stir fry.

High-Fructose Corn Syrup. This one seems to be on everyone’s “naughty list”. You are being told to read labels, and if the food has high-fructose corn syrup on the label, put it back on the shelf. But is that good advice?

It turns out that all the studies on the bad effects of high-fructose corn syrup have been done with sodas and highly processed foods. This should be your first clue.

Of course, as soon as high-fructose corn syrup gained its “bad” reputation, Big Food started replacing it with “healthier” sugars. Does that make those foods healthier?

The answer is a clear “No”. Both chemically and biologically, high-fructose corn syrup is essentially identical to sucrose (table sugar), honey, molasses, maple syrup, coconut sugar, date sugar, and grape juice concentrate. Agave sugar is even higher in fructose than high-fructose corn syrup. This is your second clue.

Substituting these sugars for high-fructose corn syrup doesn’t turn sodas and processed foods into health foods. This is nutritionism at its worst.

My advice: Forget reading the label. Forget trying to avoid foods with high-fructose corn syrup. Avoid sodas and processed foods instead.

Sugar. Once the public started to realize that natural sugars in processed foods were just as bad for us as high-fructose corn syrup, sugars became “bad”. We were told to avoid all foods containing sugar in any form. In fact, we were told we needed to become “label detectives” and recognize all the deceptive ways that sugar could be hidden on the label.

I have discussed this in detail in a previous issue of “Health Tips From The Professor”.

Let me just summarize that article with one quote, “It’s not the sugar. It’s the food. There is the same amount and same types of sugar in an 8-ounce soda and a medium apple. Sodas are bad for you, and apples are good for you.” If you are wondering why that is, I have covered it in another issue of “Health Tips From the Professor”.

Before leaving this subject, I should mention that nutritionism has reared its ugly head here as well. Big Food has struck again. They have replaced sugar with a variety of artificial sweeteners.

Once again, nutritionism has failed. Those artificially sweetened sodas and processed foods are no healthier and no more likely to help you keep the weight off than the sugar-sweetened foods they replace. I have covered the science behind that statement in several previous issues of “Health Tips From the Professor”. Here is one example.

My advice: Forget about sugar phobia. You don’t need to become a label detective. Just avoid sodas, sugar-sweetened beverages, and sweet processed foods. Get your sugar in its natural form in fruits and other whole foods.

Carbs. Dr. Strangelove and his friends are now telling you that you need to avoid all carbs. That is pure nutritionism. Carbs are neither good nor bad. It depends on the type of carb and what you replace it with.

Once again, clinical studies have given conflicting outcomes. Each side of the carbohydrate debate can provide clinical studies to support their position. How can that be? The answer is simple. It depends on what assumptions went into the design of the clinical studies. I have written several articles on this topic in “Health Tips From the Professor”, but let me give you one example here.

In this example, I looked at two major studies. The PURE (Prospective Urban Rural Epidemiology) study included data from 135,000 participants in 18 countries. In this study, the death rate decreased as the % carbohydrate in the diet decreased. The low-carb enthusiasts were doing a victory dance.

However, it was followed by a second, even larger study. The ARIC (Atherosclerosis Risk In Communities) study included 432,000 participants from even more countries. In this study, the death rate decreased as the % carbohydrate decreased to about 40%. Then a curious thing happened. As the % carbohydrate in the diet decreased further, the death rate increased.

How can you explain this discrepancy? When you examine the PURE study:

  • The % carbohydrate only ranged from 70% to 40%.
  • The data for the PURE study came primarily from third-world countries. That is an important distinction because:
    • In those countries, it is primarily the well-to-do who can afford sodas, processed foods, and meat.
    • The poor subsist on what they can grow and inexpensive staples like beans and rice.
  • Simply put, in the PURE study, the type of carbohydrate changed as well as the amount of carbohydrate.
    • At the highest carbohydrate intakes, a significant percentage of the carbohydrate came from sugar and refined grains.
    • At the lowest carbohydrate intakes, most of the carbohydrate intake came from beans, whole grains, and whatever fruits and vegetables they could grow.

When you examine the ARIC study:

  • The % carbohydrate ranged from 70% to 20%.
  • The ARIC study added in data from the US and European countries. That is an important distinction because:
    • Low carb diets like Atkins and Keto are popular in these countries. And those are the diets that fall into the 20-40% carbohydrate range.
    • Most people can afford diets that contain a lot of meat in those countries.
  • Simply put, at the lower end of the scale in the ARIC study, people were eating diets rich in meats and saturated fats and eliminating healthy carbohydrate-containing foods like fruits, whole grains and legumes.

My advice: The lesson here is to avoid the simplistic thinking of nutritionism and focus on diets rather than on foods. When you do that, it is clear that carbs aren’t bad for you; unhealthy carbs are.

By now the answer to the question, “Which Foods Should I Avoid?” is clear. Avoid sodas, sugar-sweetened beverages, and processed foods. (The term processed foods includes convenience foods, junk foods, and most sweets.)

What Does This Mean To You?

Now that we are clear on which foods you should avoid, let’s look at the flip side of the coin. Let’s ask, “Which foods should you include in your diet?”

As I said at the beginning of this article, “We have 5 food groups for a reason”. We should consider whole foods from all 5 food groups as healthy.

Of course, each of us is different. We all have foods in some food groups that don’t treat us well. Some of us do better with saturated fats or carbs than others. We need to explore and find the foods and diets that work best for us.

However, whenever we assume one diet is best for everyone, we have crossed the line into nutritionism.

What Does The Bible Say?

Let me start this section by saying that I rely on the Bible for spiritual guidance rather than nutritional guidance. However, as part of our church’s Bible reading plan, I was reading 1 Timothy. A passage from 1 Timothy 4:1-5 leapt out at me. It reinforces the theme of Michael Pollan’s book and seems uniquely applicable to the times we live in.

“The Spirit clearly says that in later times some will abandon the faith and follow deceiving spirits and things taught by demons. Such teachings come through hypocritical liars, whose consciences have been seared as with a hot iron. They…order people to abstain from certain foods, which God created to be received with thanksgiving by those who believe and who know the truth. For everything God created is good, and nothing is to be rejected if it is received with thanksgiving, because it is consecrated by the word of God and prayer.”

Interesting.

The Bottom Line

In this article, I have discussed the concept of “nutritionism” introduced in Michael Pollan’s book “In Defense Of Food”. He defines nutritionism as the belief that we can understand food solely in terms of its nutritional and chemical constituents and our requirements for them.

What Michael Pollan is referring to is taking food constituents like saturated fats, cholesterol, sugar, carbohydrates, polyunsaturated fats, monounsaturated fats, fiber, antioxidants, and probiotics and labeling them as either “good” or “bad”. He points out that when we accept these simplistic labels, we often end up creating foods and diets that are less healthy than the ones we were trying to replace.

At the beginning of the article, I asked the question, “Which Foods Should I Avoid?” I then looked at several foods or food groups we have been told to avoid, including saturated fats, red meat, high-fructose corn syrup, sugar, and carbs. When you look at the science behind these recommendations through the lens of nutritionism, you come to two conclusions:

  • We should avoid sodas, sugar-sweetened beverages and processed foods (The term processed foods includes convenience foods, junk foods, and most sweets).
  • Whole foods from all 5 food groups should be considered as healthy.

Of course, each of us is different. We all have foods in some food groups that don’t treat us well. Some of us do better with saturated fats or carbs than others. We need to explore and find the foods and diets that work best for us.

However, whenever we assume one diet is best for everyone, we have crossed the line into nutritionism.

For more details and a Bible verse that supports the theme of Michael Pollan’s book and seems uniquely applicable to the times we live in, read the article above.

These statements have not been evaluated by the Food and Drug Administration. This information is not intended to diagnose, treat, cure, or prevent any disease.

Personalized Nutrition To Change Your Life?

Author: Dr. Stephen Chaney

 

Can a personalized nutrition assessment provide you with information to assist your health strategy? We’ve been told that genetic testing is the wave of the future. We’ve been promised that genetic testing will tell us which diseases we are most likely to develop. Of course, the unspoken assumption is that if we knew which diseases were most likely to kill us, we’d be highly motivated to make the diet and lifestyle changes needed to reduce the risk of that disease.

But what if a personalized nutrition assessment based on a simple online diet survey was just as effective at getting us to make better food choices as all those fancy genetic tests? That is just what a recent study suggests.

How Was The Study Designed?

The study was based on a simple online diet survey called Food4Me developed by University College Dublin and Crème Software Ltd. The Food4Me diet survey asks people how many times per week or per day they eat basic food groups and develops personalized diet recommendations based on what they are actually eating. It is a very simple, user-friendly survey requiring only 5-10 minutes to complete. Consumer satisfaction with this kind of survey is high. For example:

  • 92% of participants said that “the Food4Me website was easy to use.”
  • 76% of participants were “satisfied with the detail of information they received in their personalized nutrition report.”
  • 80% of participants felt that “the dietary advice in the report was relevant to them.”

In spite of its simplicity and ease of use, the Food4Me survey is also quite robust. Previous studies have shown that the reproducibility and validity of the Food4Me diet survey compares very favorably with much more extensive dietary analyses (For example, R. Fallaize, et al., Journal of Medical Internet Research, 16: e190, 2014).

This study (International Journal of Epidemiology, 2016, 1-11, doi:10.1093/ije/dyw186) measured the effectiveness of the Food4Me personalized nutrition reports at improving health-related behaviors. It was a 6-month randomized control study of 1269 adults from 7 European countries. It compared 4 different interventions on health-related behavior changes. The 4 interventions were:

  • standardized dietary advice
  • personalized nutrition advice based on the Food4Me survey
  • personalized nutrition advice based on the Food4Me survey plus BMI and blood biomarkers
  • personalized nutrition advice based on all that plus genetic testing

Is Personalized Nutrition The Wave Of The Future?

The results of the study were quite striking:

  • Compared to the group who just received standardized diet advice, the groups who received personalized nutrition advice were significantly more successful at improving health related behaviors. In particular, the groups receiving personalized nutrition advice:
    • Consumed less red meat.
    • Consumed less saturated fat.
    • Consumed less salt.
    • Got more folate from their diet.
    • Had an improved “Healthy Eating Index” (a measure of overall diet quality).
  • Adding information on blood biomarkers (cholesterol, carotenoids, omega-3s, and vitamin D) and genotype did not enhance the effectiveness of the personalized nutrition recommendations at changing health behaviors.

 

What Does This Study Mean For You?

This is a single study, but it does suggest several interesting take-home lessons.

#1: We are much more likely to follow diet advice that is personalized to us than we are to follow standardized diet advice. This should come as no surprise. We’ve had generalized diet advice like the USDA Food Guide Pyramid and, more recently, the USDA My Plate guidelines for decades, and they haven’t moved the needle. Maybe people think of generalized guidelines as applying to other people and personalized guidelines as applying to them.  Personalized nutrition seems to be more effective.

#2: This was personalized diet advice, not weird diet advice. The participants were not being told to eat as much fat as they wanted. They weren’t being told that avoiding wheat will make them slimmer and smarter. They weren’t being told to eat like a caveman. They were being given USDA-approved diet recommendations. The only difference was that the dietary recommendations were personalized to them. For example, they were only being told to eat more fruits and vegetables if, in fact, fruits and vegetables were not a regular part of their daily diet.

#3: Blood biomarkers did not provide any additional incentive to increase health related behaviors. I wouldn’t read too much into this observation. With the exception of cholesterol, the blood biomarkers selected for this study merely reinforced the diet analysis. For example, you could ask whether low blood carotenoid levels really provided any additional incentive to change the diet of an individual who had already been told their intake of fruits and vegetables was low. If the study had measured disease-related blood biomarkers, it might have found that they provided additional incentive for individuals to make positive diet changes.

#4: Genetic testing did not provide any additional incentive to increase health related behaviors. This probably simply reflects the state of the science. Current genetic tests are only weakly predictive of major diseases like heart disease, diabetes, and cancer so they provide little incentive to make major lifestyle changes. This may change in the future as we improve our understanding of genetic influences on disease risks.

Missed Opportunities

This study clearly showed that a simple online diet survey like the Food4Me personalized diet assessment is very useful for changing health-related dietary behavior. However, this study also missed several opportunities to create an even more valuable tool for improving health-related behaviors. For example, the study collected data on obesity and activity levels, but did not attempt to provide personalized lifestyle recommendations based on that data. In addition, 44% of the participants reported that they had a disease, but no attempt was made to include health goals in the personalized diet and lifestyle recommendations.

 

The Bottom Line

  • A recent study showed that personalized nutrition recommendations based on a simple online survey were much more effective than standardized dietary advice at getting people to improve health-related eating habits.
  • Adding information on blood biomarkers and genetic tests did not enhance the effectiveness of the personalized nutrition recommendations at changing health behaviors.
  • The study did not evaluate the value of adding activity levels and health goals to the assessment. That perhaps represented a missed opportunity to create an even more powerful tool for positively influencing health-related behaviors.

 

These statements have not been evaluated by the Food and Drug Administration. This information is not intended to diagnose, treat, cure or prevent any disease.

Skinny Fat

Overweight Vs. Obesity

Author: Dr. Stephen Chaney

Are you skinny fat?  Weight loss season is upon us. Many of you are jumping on your bathroom scales so that you can decide how much weight you need to lose this year. For some the motivation for these New Year’s resolutions to lose weight is purely cosmetic. You just want to look better. For others the motivation for losing weight is better health. Obesity is a killer. It is associated with increased risk of diabetes, heart attack and stroke – and that’s just the tip of the iceberg.

But what if your bathroom scale says that you are normal weight? Are you off the hook? Maybe not. A recent study suggests that if you are normal weight but have central obesity (a fancy scientific term for belly fat), you are more likely to die prematurely than someone with normal fat distribution regardless of how overweight they are. That’s a pretty scary thought. It has even generated a new risk category called “skinny fat”.

How Can You Be Obese Without Being Overweight?

In recent years there has been some controversy about the health risks of obesity. Part of that controversy has arisen because obesity can be defined in multiple ways. Most of us simply hop on the scale and rely on actuarial tables to tell us what a healthy weight is for our height. Scientists, on the other hand, use two very different measures of obesity.

#1 is Body Mass Index or BMI. BMI is a person’s weight in kilograms (kg) divided by the square of his or her height in meters. By this measure:

  • Normal body weight is defined as a BMI of 18.5-24.9 kg/m².
  • Overweight is defined as a BMI of 25-29.9 kg/m².
  • Obesity is defined as a BMI of ≥30 kg/m².

#2 is waist to hip ratio or WHR. WHR is a measure of central adiposity (belly fat). By this measure:

  • Obesity is defined as excess central adiposity (excess belly fat), which is a waist to hip ratio ≥0.85 in women and ≥0.90 in men.
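These two definitions are simple enough to express as a short sketch. This is only an illustration of the cutoffs described above; the function names and structure are my own, not from the study discussed here:

```python
def bmi(weight_kg: float, height_m: float) -> float:
    """Body Mass Index: weight in kilograms divided by height in meters, squared."""
    return weight_kg / height_m ** 2

def bmi_category(value: float) -> str:
    """Classify a BMI value using the standard cutoffs."""
    if value < 18.5:
        return "underweight"
    if value < 25:
        return "normal"
    if value < 30:
        return "overweight"
    return "obese"

def has_central_obesity(waist: float, hip: float, sex: str) -> bool:
    """Waist-to-hip ratio cutoffs for central obesity: >= 0.85 for women,
    >= 0.90 for men. Waist and hip must be in the same units (e.g., cm)."""
    whr = waist / hip
    return whr >= (0.85 if sex == "female" else 0.90)
```

For example, a 70 kg person who is 1.75 m tall has a BMI of about 22.9 (“normal” weight), but a woman of that weight with an 88 cm waist and a 95 cm hip would have a WHR of about 0.93 and therefore count as centrally obese, i.e., “skinny fat”.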

In general BMI and WHR correlate. However:

  • 11% of men and 3.3% of women are normal weight according to BMI measurements, but have excess belly fat according to WHR measurements. These individuals are obese according to their WHR measurements without being overweight according to their BMI measurements, and they are the ones often referred to as “skinny fat”.
  • There are similar percentages of men and women who are overweight or obese according to BMI measurements, but have low WHR measurements. These are often referred to as “pear shaped” obese individuals to distinguish them from the “apple shaped” obese individuals with a lot of belly fat.

Being Skinny Fat Can Kill You

Numerous studies have shown that “apple shaped” obesity is much more likely to be associated with disease and premature death than “pear shaped” obesity, but there have been very few studies comparing health outcomes for normal weight individuals who have excess belly fat (people who are “skinny fat”) with health outcomes of overweight and obese individuals. This study (Sahakyan et al., Annals of Internal Medicine, 2015 Nov 10, doi: 10.7326/M14-2525) was designed to fill that void.

These scientists analyzed data from the National Health and Nutrition Examination Survey III (NHANES III). NHANES III collected BMI, WHR, and health data from 15,184 Americans (52.8% women) aged 18 to 90 years (average age 45) and followed the study participants for 14.3 years. By that time 3222 of them had died, with 1413 of those deaths being due to heart disease. The results were enlightening:

  • Normal weight individuals with excess belly fat (“skinny fat” individuals) were 1.5 – 2.0 fold more likely to die during the 14.3 year follow up period than individuals who were normal weight and had little belly fat (“skinny lean” individuals). This was expected because this had been shown in several previous studies.
  • However, the surprising finding was that normal weight individuals with excess belly fat were also more likely to die than individuals who were overweight or obese. Specifically:
  • Men who were “skinny fat” were 2.2 – 2.4 fold more likely to die prematurely than men who were either overweight or obese, but did not have excess belly fat (men with a “pear shaped” fat distribution). “Skinny fat” women were 1.3 – 1.4 fold more likely to die prematurely than overweight or obese women with “pear shaped” fat distribution.
  • Men who were “skinny fat” were even slightly more likely to die prematurely than overweight or obese men with excess belly fat (men with “apple shaped” fat distribution). “Skinny fat” women were just as likely to die as overweight or obese women with “apple shaped” fat distribution.
  • When they looked at deaths due to cardiovascular disease the results were essentially the same.
  • These results were novel and should perhaps serve as a wake-up call for normal weight individuals with excess belly fat.

The authors concluded:

  • “Our analysis of data…show that normal-weight U.S. adults with central obesity [excess belly fat] have the worst long-term survival compared with participants with normal fat distribution, regardless of BMI category.”
  • “To our knowledge, our study is the first to show that normal-weight central obesity, measured by WHR, is associated with an increased risk of cardiovascular mortality.”
  • “Our findings suggest that persons with normal-weight central obesity may represent an important target population for lifestyle modification and other preventative strategies.”

Why Is Being Skinny Fat So Dangerous?

As the authors of this study pointed out, it is well established that excess belly fat is associated with:

  • Insulin resistance, which can lead to diabetes and predispose to heart disease.
  • High triglycerides and high levels of “bad” cholesterol, which can lead to heart disease.
  • Inflammation, which can lead to a number of deadly diseases.

The metabolic effects of excess belly fat are sufficient to explain why someone who is “skinny fat” is more likely to die prematurely than someone who is “skinny lean”. However, the effect of excess belly fat is not sufficient by itself to explain why a “skinny fat” individual is more likely to die prematurely than someone who is overweight or obese.

To understand this we need to recognize that both fat and muscle contribute to body weight (and to BMI). The “skinny fat” individual has more fat mass AND less muscle mass than a “skinny lean” individual of the same weight. That is a huge factor because metabolically speaking muscle is protective. It opposes all of the bad metabolic effects of belly fat.

Simply put, being “skinny fat” is extremely dangerous because you have increased all the bad metabolic effects of excess belly fat, AND you have decreased the protective metabolic effect of muscle mass.

How Do You Go From Being “Skinny Lean” To “Skinny Fat”?

Most of us were lean in our younger years. For those of us who end up as “skinny fat” as we age, it is pretty obvious that there are two processes going on simultaneously.

#1: Loss of Muscle Mass: It would be easy to say that becoming “skinny fat” is a natural part of aging. The natural tendency is to lose muscle mass and replace it with fat mass as we age. If we “just go with the flow” all of us will end up being “skinny fat” at some point. However, the loss of muscle mass as we age is accelerated by our sedentary lifestyle and our diet (more on that below).

#2: Gain of Belly Fat: To some extent whether we store excess fat as “pears” or “apples” is genetically determined. However, what we eat can also exert a major influence. For example:

  • Alcohol: The term “beer belly” says it all. Excess alcohol consumption is associated with an increase in belly fat. Once you understand the metabolism of alcohol the explanation is pretty simple. Alcohol causes blood sugar to drop, which increases appetite. Alcohol also interferes with our judgement, which can cause us to make poor food choices.
  • Excess saturated fat tends to be stored preferentially as belly fat.
  • Excess sugars and simple carbohydrates are rapidly converted to fat stores and stored as belly fat.

What Can You Do If You Are Already Skinny Fat?

Let’s start with what you shouldn’t do. You should not go on a reduced calorie weight loss diet to get rid of your excess belly fat. The last thing you want to do is to end up being underweight with excess belly fat! Here is what you should do:

#1: Increase Your Muscle Mass: I said that loss of muscle mass was a natural part of aging. I didn’t say that it was an inevitable part of aging. If you want to prevent or reverse loss of muscle mass you need to:

  • Get really serious about exercise. I’m talking about 30-minute workouts at least 3-5 times per week. These workouts need to include strength training as well as aerobics and flexibility exercises. I would suggest you ask your health professional what kind of exercise program is best for you and start your exercise program under the guidance of a personal trainer or physical therapist.
  • Make sure that your diet contains enough protein and enough of the essential amino acid leucine to maximize the gain of lean muscle mass following your workouts. I covered the latest age-appropriate recommendations in “Leucine and Muscle Gain,” a previous “Health Tips From The Professor.”

#2: Lose Your Belly Fat:To some extent you will start to lose your belly fat naturally if you follow the recommendations above. In addition, you will want to:

  • Drink alcohol in moderation.
  • Make food choices that allow you to replace saturated fat with monounsaturated fat and polyunsaturated fats, especially the omega-3 polyunsaturated fats.
  • Replace excess sugars and simple carbohydrates with complex carbohydrates from fresh fruits and vegetables along with modest amounts of whole grain foods.

The Bottom Line

  • A recent study has shown that being “skinny fat” (having normal body weight, but excess belly fat) is more likely to result in premature death than if you were overweight, or even obese.
  • The most likely explanation for this alarming statistic is that someone who is “skinny fat” has excess belly fat, which predisposes to a number of diseases, and a loss of muscle mass, which protects against those same diseases.
  • If you are overweight or obese, you need to reduce your caloric intake to lose weight. However, if you are “skinny fat”, you don’t want to reduce your caloric intake. You need to change your exercise and diet habits.
  • Loss of muscle mass and gain of fat mass is a normal part of aging. However, you can slow or reverse the age-related loss of muscle mass with an exercise program and enough protein and leucine in your diet to maximize the effects of that workout program (details above).
  • You can prevent or get rid of excess belly fat by:
  • Following the exercise program and nutritional support of that exercise program described above.
  • Making food choices that replace saturated fats with monounsaturated fats and polyunsaturated fats, especially omega-3 polyunsaturated fats.
  • Replacing foods high in sugar and simple carbohydrates with fresh fruits and vegetables and whole grains in moderation.

 

These statements have not been evaluated by the Food and Drug Administration. This information is not intended to diagnose, treat, cure or prevent any disease.

Is Vitamin E Deficiency Common in the US

 Does Vitamin E Matter?

Author: Dr. Stephen Chaney

A headline claiming “Over 90% of Twentysomethings Have Suboptimal Vitamin E Status” caught my eye the other day, so I decided to investigate further. If you have been following all of the information and misinformation about vitamin E in the online media, you are probably confused – and this headline just adds to the confusion. There are probably three basic questions you want answered:

  • Is the latest study valid? Are most Americans vitamin E deficient?
  • Does it matter? Vitamin E has been described as “a vitamin in search of a disease”. If there are no diseases associated with vitamin E deficiency, should we even be concerned if most Americans are vitamin E deficient?
  • Is there any value to vitamin E supplementation? You will see claims that vitamin E supplementation has been proven not to work. Are these claims valid?

Let me guide you through the maze. I will start by analyzing the study behind the current headlines.

Are Americans Vitamin E Deficient?

The best food sources of vitamin E are nuts, seeds and unrefined vegetable oils, followed by green leafy vegetables. Since these foods are not abundant in the American diet, it is no surprise that previous studies have shown that 83% of US children and 91% of US adults do not consume the recommended 12 mg/day of vitamin E. Consequently, the 2015 Dietary Guidelines Advisory Committee identified vitamin E as a “shortfall nutrient”.

This study (McBurney et al, PLoS One 10(8): e0135510 doi: 10.1371/journal.pone.0135510) took the next logical step by asking whether the inadequate intake of vitamin E leads to inadequate blood levels of the vitamin. The authors analyzed data from 7,922 participants who had their blood levels of alpha-tocopherol (the most abundant form of vitamin E) determined in the 2003-2006 National Health and Nutrition Examination Survey (NHANES).

They subdivided participants into those who used no supplements (4049) and those who used supplements (3873). (Note: The supplement users were not necessarily using vitamin E supplements, but many were using a multivitamin supplement containing vitamin E.) The authors compared the study participants’ blood levels of vitamin E with the Institute of Medicine standard for vitamin E deficiency (12 umol/L) and with a standard they set for adequate vitamin E levels (30 umol/L). Here are the results of their analysis:

  • People who did not use supplements had lower blood levels of vitamin E (24.9 umol/L) than those who used supplements (33.7 umol/L). No surprise here.
  • Only 0.6% of Americans were clinically deficient in vitamin E (blood levels < 12 umol/L). The prevalence of vitamin E deficiency did not vary significantly with age, gender or ethnicity.
  • When they looked at the people not using supplements, the percentage with suboptimal vitamin E status (blood levels < 30 umol/L) varied significantly by age, but was not significantly affected by gender or ethnicity. In this analysis the percentage with suboptimal vitamin E status was:
  • 92.7% for ages 20-30.
  • 8% for ages 31-50.
  • 2% for ages 51 and above.
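The two cutoffs used in this analysis amount to a simple two-threshold classification. Here is a minimal sketch of that rule; the function name is mine, not from the study:

```python
def vitamin_e_status(alpha_tocopherol_umol_per_l: float) -> str:
    """Classify a blood alpha-tocopherol level using the study's cutoffs:
    below 12 umol/L is clinically deficient (Institute of Medicine standard);
    below 30 umol/L is suboptimal (the authors' chosen standard)."""
    if alpha_tocopherol_umol_per_l < 12:
        return "deficient"
    if alpha_tocopherol_umol_per_l < 30:
        return "suboptimal"
    return "adequate"
```

By this rule, the non-supplement users’ average of 24.9 umol/L counts as “suboptimal,” while the supplement users’ average of 33.7 umol/L counts as “adequate.”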

Were The Headlines Correct?

Technically speaking, the headlines were correct. 92.7% of Americans aged 20-30 who used no supplements had suboptimal blood levels of vitamin E as defined in this study. When you combined both supplement users and non-users, the percentage with suboptimal blood levels of vitamin E was only slightly less (87.4%). However, there are a couple of important caveats:

  • There is no internationally recognized standard for adequate blood levels of vitamin E. The authors had a reasonable rationale for choosing 30 umol/L as their standard for adequate blood levels, but they also acknowledged that the Estimated Average Requirement of vitamin E from food (12 mg/day) would result in a blood level of 27.9 umol/L, so their standard may be a bit high.
  • The average blood level of vitamin E for non-supplement users was 24.9 umol/L. While that is less than adequate, it is only slightly low – especially if the lower standard of 27.9 umol/L is used.

I think it would be more accurate to say that a large percentage of Americans have blood levels of vitamin E that are slightly below what is considered adequate but are far above what could be considered clinically deficient. The question then becomes “Does it matter?”

Does Vitamin E Matter?

Let me start with a little perspective. In the United States diseases like scurvy, pellagra and beriberi are things of the past. We simply don’t see deficiency diseases anymore. What we do see are intakes of essential nutrients that are slightly below optimal. Vitamin E is no different.

If we focus on suboptimal nutrient intake by itself, the answer would probably be that it doesn’t matter. Suboptimal nutrition is seldom enough to cause poor health by itself.

However, we also need to take into account individual differences that affect the need for essential nutrients. Poor health is much more likely to arise when suboptimal intake of one or more essential nutrients is coupled with increased needs due to genetic predisposition, risk factors that predispose to disease, and/or pre-existing disease.

With this perspective in mind, we are ready to ask whether suboptimal intake of vitamin E or any other essential nutrient matters. The answer is pretty simple. It doesn’t matter for everyone, but it matters very much for those individuals with increased needs.

If we had a good way of assessing individual nutritional needs, it would be easy to say who needed supplements and who didn’t. The problem is that we currently have no good way of assessing individual needs for essential nutrients. We simply cannot predict who will and who won’t be affected by suboptimal nutrient intake. That is why millions of Americans take supplements on a daily basis.

Is There Any Value To Vitamin E Supplementation?

That brings us to the final question. Is vitamin E supplementation a waste of money? You’ve probably already heard that most studies have failed to show any benefit from vitamin E supplementation, but you may be asking “How can that be when we also know that most Americans are getting suboptimal levels of vitamin E in their diet?”

With the perspective I described above in mind, the answer is pretty simple. Those studies have been asking the wrong question. They have been asking whether vitamin E supplements benefit everyone. They haven’t asked whether vitamin E supplements benefit people with increased needs.

When you ask that question the answer is very different. Let me give you two examples, one involving risk factors that predispose to disease and the other a genetic predisposition:

  • In the Women’s Health Study (JAMA, 294: 56-65, 2005) vitamin E supplementation had no effect on heart attack or stroke in the general population. But when they looked at women over 65 (those at highest risk for heart disease), vitamin E supplementation reduced heart attack and stroke by 25% and cardiovascular deaths by 49%.
  • In the Heart Outcome Prevention Evaluation Study (Diabetes Care, 27: 2767, 2004; Arteriosclerosis, Thrombosis, and Vascular Biology, 24: 136, 2008) vitamin E supplementation had no effect overall on heart attacks or cardiovascular deaths. But when they looked at a population who had a haptoglobin genotype that significantly increases the risk of heart disease, vitamin E supplementation significantly decreased the risk of both heart attacks and cardiovascular deaths.

 

The Bottom Line

  • Recent headlines saying that over 90% of young Americans have suboptimal vitamin E status are technically correct, but a bit overstated. It probably would have been more accurate to say that most Americans have slightly suboptimal vitamin E status.
  • The important question then becomes “Do marginal nutritional deficiencies matter?” The answer is pretty simple. Marginal nutritional deficiencies do not matter for everyone. However, they matter very much for those people who have increased needs for that nutrient due to genetic predisposition, risk factors for disease or pre-existing disease.
  • If we had a good way of assessing individual nutritional needs, it would be easy to say who needed supplements and who didn’t. However, we don’t have a good way of assessing increased needs for most nutrients, which is why many Americans use supplements on a daily basis.
  • As for all of those studies saying that vitamin E supplementation has no benefit, they are a bit misleading because they are asking the wrong question. They are asking whether vitamin E supplementation benefits everyone. They are not asking whether vitamin E supplementation benefits people with increased needs. When you ask that question the answer is very different (see examples in the article above).

 

These statements have not been evaluated by the Food and Drug Administration. This information is not intended to diagnose, treat, cure or prevent any disease.

Soy and Hot Flashes

Will Soy Put Out The Fire?

Author: Dr. Stephen Chaney

 

There has been a lot of controversy in recent years about soy and hot flashes. The question is whether soy isoflavones reduce the hot flashes associated with menopause.

And this is an important question! Because of concerns about increased heart attack risk with hormone replacement therapy (HRT), many women have been looking for natural alternatives to HRT for reducing hot flashes and other symptoms of menopause. They’ve been asking whether soy isoflavones are effective, and the answers that they’ve been getting have been confusing.

For example, you can still find many experts and health professionals who will tell you that soy isoflavones have no proven effect on menopause symptoms.

That is somewhat surprising since two recent meta-analyses (Howes et al, Maturitas, 55: 203-211, 2006; Williamson-Hughes, Menopause, 55: 203-211, 2006) and a 2010 expert panel of The North American Menopause Society have all concluded that soy isoflavones alleviate hot flashes.

Will Soy Put Out The Fire?

However, clear guidance in this area was sorely needed, so Taku et al (Menopause, DOI: 10.1097/gme.0b013e3182410159, 2012) performed an even larger meta-analysis that included 19 published clinical trials – some of which had been published after the previous two meta-analyses were performed.

I’ve talked about meta-analyses before, so you probably already know that they are very powerful because they combine the results of many individual clinical trials into a single data analysis.

But you also may remember me telling you that meta-analyses can be misleading if they introduce bias because of the kinds of clinical studies that they exclude from their analysis.

So I examined the design of this meta-analysis very carefully. It excluded clinical trials that:

  • were not double blind, placebo controlled and designed in such a manner that the placebo was indistinguishable from the soy isoflavone preparation.
  • contained other substances in addition to the soy isoflavones (The presence of other substances in the preparation might have influenced the response).

There were several other well justified reasons for excluding some studies from the meta-analysis, but they were technical in nature. In my opinion this was a very well designed study.

And the results were clear cut. An average of 54 mg of soy isoflavones (some studies used a little less, some a little more) was sufficient to reduce:

  • the frequency of hot flashes by 21% – and –
  • the severity of hot flashes by 26%

Soy and Hot Flashes: What Does This Study Mean For You?

The results of this study were highly statistically significant. So if you are suffering from hot flashes and are wondering whether soy isoflavones will put out the fire, the answer appears to be YES.

That’s the good news.

The bad news is that 21-26% is not a huge effect.

And, if you look at the individual clinical studies it is apparent that the response is highly variable. Some women experience major relief from hot flashes and other menopause symptoms, while other women experience little or no relief.

The reason for this variability is not known, but it is likely that the effectiveness of soy isoflavones on reducing hot flashes is modified by other components of the diet and by lifestyle factors such as obesity, exercise and stress.

Soy And Hot Flashes: The Bottom Line

My take on this is that soy isoflavones should not be thought of as a “magic bullet” that will make hot flashes go away by themselves, but rather as a proven part of a holistic approach that encompasses a healthy diet, exercise, weight control and stress reduction.

The Bottom Line

  • A recent meta-analysis of 19 published clinical studies showed that soy isoflavones reduced the frequency of hot flashes by 21% and the severity of hot flashes by 26%.
  • The results were highly statistically significant, but 21-26% reduction in symptoms is not a huge effect.
  • When they looked at the individual clinical studies it was apparent that the response is highly variable. Some women experienced major relief from hot flashes and other menopause symptoms, while other women experienced little or no relief.
  • My take on this is that soy isoflavones should not be thought of as a “magic bullet” that will make hot flashes go away by themselves, but rather as a proven part of a holistic approach that encompasses a healthy diet, exercise, weight control and stress reduction.

 

These statements have not been evaluated by the Food and Drug Administration. This information is not intended to diagnose, treat, cure or prevent any disease.

Iron and Brain Development

Iron and the Teen Brain
Author: Dr. Steve Chaney

 

For those of you with teenagers – or who have had teenagers in the past – you may suspect that there’s nothing between their ears. But actually there is a lot going on between their ears, and some of the neural contacts laid down in the brain during the teen years influence the health of their brain during their adult life. Let’s look at the association between iron and brain development.

And – no surprise here – what they eat can affect the health of their brain as well.

Which brings me to a study published in the Proceedings of the National Academy of Sciences January 9, 2012 (doi: 10.1073/pnas.1105543109) that looks at the adequacy of dietary iron intake during the teenage years and brain health in adulthood.

Basics of Iron Metabolism

Before I describe the study perhaps a little bit of what I call Biochemistry 101 is in order.

Free iron is toxic to living cells. For that reason, our body produces multiple proteins to bind and transport the iron. The protein that binds and transports iron through the bloodstream is called transferrin. Under normal conditions, about 1/3 of the transferrin in our bloodstream has iron bound to it and 2/3 does not. And that is the ideal ratio of bound and unbound transferrin for delivery of iron to brain cells and other cells in our body.

When our diet is iron deficient (or we have excessive blood loss), the percent iron saturation of transferrin decreases. The body tries to compensate by producing more transferrin, but this doesn’t really help since the problem was inadequate iron supply, not inadequate transferrin supply. As a consequence, elevated transferrin levels are generally indicative of an iron-deficient diet.
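For readers who like to see the arithmetic, here is a minimal sketch of how percent transferrin saturation behaves. The numbers are made up purely to illustrate the direction of the change, not actual diagnostic values:

```python
def transferrin_saturation(serum_iron, tibc):
    """Percent of transferrin carrying iron: serum iron divided by
    total iron-binding capacity (TIBC, which rises as the body
    makes more transferrin)."""
    return 100.0 * serum_iron / tibc

# Made-up numbers, chosen only to show the direction of the change:
adequate = transferrin_saturation(100, 300)   # iron supply adequate
deficient = transferrin_saturation(40, 450)   # low iron, extra transferrin

print(round(adequate))   # 33
print(round(deficient))  # 9
```

Because iron deficiency lowers the numerator (serum iron) and raises the denominator (total transferrin) at the same time, saturation falls on both counts – which is why elevated transferrin levels flag an iron-poor diet.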

Iron and Brain Development in Teens

The study was led by Dr. Paul Thompson of the UCLA Department of Neurology. He and his team performed brain scans on 631 healthy young adults with an average age of 23. The brain scans were of a type that measured strength and integrity of the connections between the nerves in the brain – in other words, the brain’s wiring. They then went back and looked at the amount of iron available to each subject’s brain during adolescence by looking at their blood transferrin levels from routine physical exams performed at ages 12, 14 and 16 (blood transferrin levels are often measured as part of routine physical exams).

The results were pretty clear cut. Elevated transferrin levels during the teenage years were associated with reduced brain-fiber integrity in regions of the brain that are known to be vulnerable to neurodegeneration. These individuals did not show any cognitive impairments as young adults, but the concern is that they might be more likely to develop cognitive impairments as they age.

Dr. Thompson summarized his team’s findings by saying that “Poor iron levels in childhood erode your brain reserves which you need later in life to protect against aging and Alzheimer’s. This is remarkable, as we were not studying iron deficient people, just around 600 normal healthy people. It underscores the need for a balanced diet in the teenage years, when your brain command center is still actively maturing.”

Questions Every Parent Should Ask

If you have teenagers you might want to ask yourself questions like:

  • What is your teenager’s diet like?
  • Is it balanced?
  • Are you sure that it meets their nutritional needs?
  • Should you consider supplementation to make sure that they are getting all of the nutrients that they need?

 

The Bottom Line

  • A recent study suggested that inadequate iron intake in the teenage years may affect how our brains are wired in our adult years. The authors of the study interpreted the study as suggesting that an inadequate diet during the teen years could predispose us to cognitive decline and Alzheimer’s as adults.
  • This study only looked at structural differences in the brain circuitry. We can’t conclude from this study alone that inadequate iron intake as a teenager will doom somebody to cognitive impairment and increased Alzheimer’s risk as they age. But we can conclude that adequate iron intake during adolescence is required for normal brain development.
  • And it’s probably not just iron. This study focused on iron status because transferrin levels are routinely measured during physical exams so it was easy to go back and determine what each subject’s iron status was during their teenage years. Many other important nutrients are required for normal brain development, but we don’t have an easy way of going back and determining what someone’s nutritional status was for those nutrients in their teen years. What was shown to be true for iron in this study is likely to be true for other nutrients as well.
  • These were normal teens eating a normal American diet. They weren’t from a third world country and there was nothing weird about what they were eating. But, clearly some of the subjects in the study weren’t getting the iron that they needed from diet alone.
  • The teen years are a time of rapid growth and maturation. It’s not just the brain that needs the proper balance of nutrients during the teen years. All of their tissues require proper nutrition.

 

These statements have not been evaluated by the Food and Drug Administration. This information is not intended to diagnose, treat, cure or prevent any disease.

Are All Calories Created Equal?

Are Food Choices More Important Than Calories?

Author: Dr. Stephen Chaney

 

Most adult Americans gain a pound or two each year. That may not sound like much on a yearly basis, but over a lifetime it is huge – if you’ll pardon the pun.

Because the health consequences of weight gain are so devastating, everyone has their favorite dietary advice for keeping those extra pounds away. For some it is diet plans – low fat, low carb, paleo, Mediterranean – you name it. For others it is counting calories or avoiding sugars of all kinds. The list goes on. Are all calories created equal?

But what if all of these approaches were wrong? What if we could keep our weight under control solely based on the foods we eat? A recent study suggests that we just might be able to.

How Was The Study Designed?

A group of scientists from Tufts University and Harvard decided to look at how the food choices we make on a daily basis influence our weight gain or loss over time (Smith et al, AJCN 101: 1216-1224, 2015). However, they designed their study in a unique way, and it is important that I explain the study design so that you can understand the strengths and limitations of the study.

Most studies of this kind look at what foods people are eating and compare that to how much they weigh. These scientists looked at changes that people made in their diets and correlated that with how much weight they gained or lost over time.

When you think of it, that’s the information most of us really want to know. We are less interested in why the foods we used to eat got us into trouble in the first place than we are in how the changes we make in our diet might influence future weight loss or gain.

This study combined the data from three very large, long term studies – the Nurses’ Health Study, the Nurses’ Health Study II, and the Health Professionals Follow-Up Study. Altogether that is a group of 120,784 men and women who were followed for 16-24 years. All three of these studies measured weight and evaluated dietary habits using food-frequency questionnaires every 4 years.

The scientists conducting the study measured changes in food choices and changes in weight for each individual in 4-year increments over the total duration of the studies. In analyzing the data, they looked at choices of protein foods, total carbohydrate, and the glycemic load (GL) of the carbohydrates.

Glycemic load is the glycemic index (effect on blood sugar) of the carbohydrates in a particular food times the total amount of carbohydrate in that food. You can think of glycemic load as a measure of carbohydrate quality. For example, white bread, pastries, muffins, pancakes, white rice, chocolates, candy bars, cookies, brownies, cakes, pies, and pretzels would all be examples of foods with a high glycemic load. Fruits, whole grain foods and starchy vegetables would be examples of foods with a moderate glycemic load. Vegetables and beans would be examples of foods that generally have a low glycemic load.
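For those who want to see the calculation, the conventional formula is glycemic load = glycemic index × grams of carbohydrate per serving ÷ 100. The GI values and serving sizes below are typical published figures, not numbers from this study:

```python
def glycemic_load(glycemic_index, carb_grams):
    """Conventional definition: GL = GI x grams of carbohydrate
    in the serving / 100. A GL of 20 or more per serving is
    usually called high; 10 or less, low."""
    return glycemic_index * carb_grams / 100.0

# Typical (approximate) GI values and serving sizes:
print(glycemic_load(75, 30))  # white bread serving -> 22.5 (high)
print(glycemic_load(40, 15))  # small apple         -> 6.0  (low)
print(glycemic_load(30, 7))   # carrots             -> 2.1  (low)
```

This is why white bread ranks high while carrots rank low even though carrots have a respectable glycemic index: a serving of carrots simply contains very little carbohydrate.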

The authors of the study did not measure calories or fat intake for this study, but those factors are indirectly included in food choices – more about that later.

Are All Calories Created Equal?

Now let’s get to the good stuff – the results of this study. When the authors analyzed the data they found that:

  • Most of the subjects did not exchange one protein food for another over the course of the study. They exchanged protein foods for carbohydrate-rich foods and vice versa.

This was a surprise. Since many experts have been recommending that people substitute chicken and fish for red meat, they had expected to see that kind of dietary shift when they analyzed the data. Apparently, people have not been listening to the experts!

  • When the subjects replaced a serving of carbohydrate-rich foods with a serving of red meats, processed meats, chicken with skin, or most cheeses, they gained between 0.5 and 2.3 pounds per year. Within this category the greatest weight gain was seen when hamburgers were substituted for carbohydrates, and the least weight gain was seen when cheese was substituted for carbohydrates.
  • When the subjects replaced a serving of carbohydrate-rich foods with a serving of milk, peanuts or eggs there was no net change in weight. These appear to be substitutions that are good for weight maintenance.
  • When the subjects replaced a serving of carbohydrate-rich foods with a serving of yoghurt, peanut butter, beans, walnuts, other nuts, chicken without skin, low-fat cheese or seafood they lost between 0.5 and 1.5 pounds/year. Within this category the greatest weight loss was seen when plain yoghurt was substituted for carbohydrates, and the least weight loss was seen when low-fat cheese was substituted for carbohydrates.
  • When they focused on carbohydrate-rich foods, replacing one serving of high glycemic load foods with low glycemic load foods was associated with one pound of weight loss per year. Simply put, if you switch from cookies, pastries and candies to fruits and vegetables, you are likely to lose weight. No surprise here.

The study really got interesting when they looked at the effect of adding different proteins in the context of the carbohydrate-rich foods that the subjects were eating. For example,

  • When the subjects added a serving of red meat to a diet containing carbohydrate foods with a high glycemic load, they gained an average of 2.5 pounds per year. When they added that same serving of red meat to a diet containing carbohydrate foods with a low glycemic load, they gained only around 1.5 pounds per year.

Simply put, that means eating a hamburger on a white flour bun with fries is going to pack on more pounds than a hamburger patty with brown rice and a green salad.

  • The effect of glycemic load was particularly interesting when you looked at the protein foods that were good for weight maintenance overall. For example, adding a serving of eggs to a high glycemic load diet resulted in a 0.6 pound/year weight gain, while adding that same serving of eggs to a low glycemic load diet resulted in a 1.75 pound/year weight loss. The results were similar for cheeses.
  • Finally, glycemic load also influenced the effectiveness of protein foods associated with weight loss. For example, addition of a serving of beans to a high glycemic load diet resulted in a 0.5 pound/year weight gain, but adding a serving of beans to a low glycemic load diet resulted in a 1.5 pound/year weight loss.
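To keep the estimates straight, here is a minimal lookup that collects the per-serving substitution effects reported above. The food groupings are simplified, and the ranges are the study's reported values as summarized in this article:

```python
# Reported effect of swapping one daily serving of carbohydrate-rich
# food for one serving of a protein food (pounds/year; + = gain).
substitution_effect = {
    "red meat, processed meat, skin-on chicken, most cheeses": (0.5, 2.3),
    "milk, peanuts, eggs": (0.0, 0.0),
    "yoghurt, peanut butter, beans, nuts, skinless chicken, seafood": (-1.5, -0.5),
}

for foods, (low, high) in substitution_effect.items():
    print(f"carbs -> {foods}: {low:+.1f} to {high:+.1f} lb/yr")
```

The middle group is the "weight maintenance" category; the ranges in the other two groups span the best and worst foods within each category.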

New Insights From This Study

This study broke new ground in several areas. For example,

  • We have heard over and over that substituting beans, chicken and fish for red meats is healthier. This is the first study I have heard of that says those same substitutions can prevent or reverse weight gain.
  • Many people advocate a high protein diet for weight control or weight loss, but many of them will tell you the type of protein doesn’t matter. This study suggests that the type of protein foods we eat is important in determining whether we lose or gain weight.
  • Everyone knows that switching from white grains, pastries and candy to whole grains, fruits and vegetables will help you lose weight, but this is the first study I’m aware of that suggests those same changes will influence whether the protein foods we eat lead to weight gain or weight loss.
  • Many people focus on fats and calories when trying to avoid weight gain. While this study is not really fat and calorie neutral (see below), it does suggest that if we focus on eating healthy foods we don’t need to be counting every fat gram and every calorie.
  • Finally, this study suggests that if we forget all of those crazy diets and focus on eating healthy foods, our weight will take care of itself. Not exactly a novel concept, but one worth repeating.

Limitations of the Study

The lead author of this study stated in an interview, “The idea that the human body is just a bucket for calories is too simplistic. It’s not just a matter of thinking about calories or fat. What’s the quality of the foods we are eating? And how do we define quality?” This has been picked up by the media with statements like “not all calories are created equal”.

That is a bit of hyperbole, because this study is not really fat and calorie neutral. The protein foods (red and processed meats) that pack on the most pounds are higher in fat and calories per serving than those protein foods (skinless chicken, fish and beans) that cause the least weight gain. Similarly, the carbohydrate foods with the highest glycemic load (pastries, cakes and candy) are higher in fat and calories per serving than those carbohydrate foods with the lowest glycemic load (fruits and vegetables).

The real message is not that fat content and calories don’t count. Nor is it that calories in some foods count more than the same calories in other foods. The take home lesson from this study should be that we don’t have to focus on fat and calories. If we focus on healthy foods, the fat and calories tend to take care of themselves.

But, even that message is a bit too simplistic. Choosing healthy foods is not all that there is to weight control. We also need to consider:

  • Portion sizes. Half a chicken could easily add more calories than a small hamburger.
  • How the food is cooked. Fish cooked in a cream sauce may not be any better for weight control than a slab of red meat.
  • Exercise. We need to maintain muscle mass to keep metabolic rate high.

 

The Bottom Line

  • A recent study has broken new ground and provided some new insights into how to prevent those extra pounds from sneaking up on us over time. This study evaluated how some simple changes we could make in the foods we eat can influence whether we gain or lose weight.
  • One part of the study looked at the effects of replacing a serving of carbohydrate rich foods with a serving of protein rich foods. If that protein rich food were a hamburger, we could expect to gain about 2.3 pounds/year. If that protein rich food were seafood, we could expect to lose about 1.5 pounds/year. Other protein foods fall in between those extremes. The specifics are covered above.

This is a new insight. Many people advocate a high protein diet for weight control or weight loss, but many of them will tell you the type of protein doesn’t matter. So, are all calories created equal? This study suggests that the type of protein foods we eat is important in determining whether we lose or gain weight.

  • Another part of the study looked at the effect of different carbohydrate foods based on their glycemic load (the effect they have on blood sugar). Simply replacing 1 serving of high glycemic load foods (refined grain foods, cookies, cakes, candy) with low glycemic load foods (whole grains, fruits and vegetables) was associated with a one pound/year weight loss. This should surprise no one.
  • Finally, one part of the study looked at the influence of glycemic load on the effect that various proteins have on weight gain or loss. For example, adding a serving of eggs to a high glycemic load diet resulted in a 0.6 pound/year weight gain, while adding that same serving of eggs to a low glycemic load diet resulted in a 1.75 pound/year weight loss. Other examples are given above.

This is also a new insight. Everyone knows that switching from white grains, pastries and candy to whole grains, fruits and vegetables will help you lose weight, but this is the first study I’m aware of that suggests those same changes will influence whether the protein foods we eat lead to weight gain or weight loss.

  • Some in the media have interpreted this study as saying that fat and calories don’t count. However, this study was not really fat and calorie neutral. The protein and carbohydrate rich foods that packed on the most pounds were also the foods highest in fat and calories. The real take home message from this study is that we may not need to focus so much on fat and calories. When we focus on eating healthy foods the fat and calories tend to take care of themselves.
  • Even that message is a bit too simplistic. It is not enough to just focus on healthy foods. We need to consider things like portion size, how the food is prepared, and our exercise habits among other things.

 

These statements have not been evaluated by the Food and Drug Administration. This information is not intended to diagnose, treat, cure or prevent any disease.

What Is Epigenetics

Can What We Eat Affect Our Kids?

Author: Dr. Stephen Chaney

 

What is epigenetics? For me, the first stages of understanding came a while back. When I was a young graduate student (which is more than just a few years ago), I was taught that all genetic information resided in our DNA. During conception, we picked up some DNA from our dad and some from our mom, and that DNA was what made us a unique individual.

We knew that environmental influences such as diet, lifestyle and exposure to toxic chemicals could affect our health personally. However, we never dreamed that the effects of those environmental influences could actually alter our gene expression, and that those genetic alterations could be passed on to our children.

Today we know that environmental influences can actually modify our DNA and that those modifications can be passed on to our offspring – a process called epigenetics.

What Is Epigenetics & How Does It Affect Gene Expression?

Simply put, epigenetics involves chemical modifications to our DNA and to the proteins associated with it. Our DNA can be methylated, and the histone proteins that bind to our DNA can be acetylated or modified in multiple other ways. That is important for two reasons:

  • These alterations can turn genes on and off. That means that epigenetic modifications can alter gene expression.
  • These alterations can be influenced by our environment – diet, lifestyle, and exposure to environmental chemicals.

In a previous “Health Tips From the Professor” article titled “Can Diet Alter Your Genetic Destiny?”  I discussed recent research suggesting that a healthy diet and lifestyle causes epigenetic changes in the DNA that may reduce your risk of heart disease, cancer and diabetes.

That alone was a monumental discovery. Even more monumental is the recent discovery that at least some of those epigenetic changes can be passed on to our children, which brings me to the question I posed in the title of this article: “Can what we eat affect our kids?”

Animal Studies Showing That Epigenetic Changes Can Be Inherited

As is often the case, the first definitive study showing that epigenetic changes were heritable was an animal study. This study was done with a mouse strain called agouti (Waterland and Jirtle, Mol. Cell. Biol. 23: 5293-5300, 2003). Agouti mice can have two remarkably distinctive phenotypes. They can either have a yellow coat, become obese as adults, and be prone to cancer and diabetes as they age, or they can have a brown coat and grow up to be lean and healthy.

It had been known for some time that these phenotypic differences were controlled by the epigenetic methylation of a specific gene called the agouti gene. The agouti gene codes for a genetic regulator that controls coat color, feeding behavior, and body weight set-point, among other things. When the agouti gene is undermethylated, it is active. As a consequence, the mice have yellow coats and are prone to obesity. When the agouti gene is highly methylated, it is inactive. The mice have brown coats and are lean and healthy.

Moreover, methylation of the agouti gene is not a purely random event. Mothers with the yellow, obese phenotype tended to produce a preponderance of offspring with the same phenotype and vice-versa. In short, the epigenetic methylation pattern of the agouti gene could be passed from generation to generation. It was heritable.

Waterland and Jirtle’s research broke new ground by showing that the methylation of the agouti gene could be strongly influenced by what the mother ate while the fetal mice were still in the womb.

When they fed agouti mothers a diet with extra folic acid, B12, betaine and choline (all nutrients that favor DNA methylation) during conception and pregnancy the agouti gene of their offspring became highly methylated. A high percentage of those offspring had brown coats and grew up to be lean and healthy.

However, when Waterland and Jirtle put agouti mothers on a diet that was deficient in folic acid, B12, betaine and choline during conception and pregnancy, the agouti gene of their offspring was undermethylated. Many of those offspring had yellow coats and grew up to be fat and unhealthy.

Subsequent studies from the same laboratory have shown that:

  • Addition of genistein, a phytonutrient from soy, to the maternal diet also favors methylation of the agouti gene and protects against obesity in agouti mice (Dolinoy et al, Environmental Health Perspective, 114: 567-572, 2006).
  • The addition of the environmental toxin bisphenol A to maternal diets causes undermethylation of the agouti gene and predisposes agouti mice to obesity, but this effect can be reversed by also feeding the mother genistein or folic acid and related nutrients during pregnancy (Dolinoy et al, PNAS, 104: 13056-13061, 2007).

The agouti mice studies provide a dramatic example of how diet and environmental exposure during pregnancy can cause epigenetic changes in fetal DNA that have long term health consequences for the offspring. However, they are animal studies. Does the same hold true for humans?

Diet, Epigenetic Changes, and Obesity in Humans

With humans, it is really difficult to determine whether epigenetic changes that occur during conception and pregnancy affect our children. That is because when you measure an epigenetic effect in a child or adult, it is difficult to sort out how much of that effect was caused by what the mom ate during pregnancy and how much was caused by how the family ate as the kids were growing up.

Unfortunately, there is a tragic human experiment showing that the same kind of epigenetic changes are heritable in humans. I’m referring to what is known as the “Dutch Hunger Winter”. This was a period of starvation during 1944-1945, the final year of World War II, when the Germans set up a blockade that prevented food from reaching western Holland. During those few months, even pregnant women were forced to live on food rations providing as little as 500 calories a day.

This was an event without parallel in human history. Holland is not a third world country. Once the blockade was lifted, children born during the Hunger Winter had the same plentiful supply of food as every other Dutch citizen. This has allowed generations of research scientists to ask what the long-term effects of a brief exposure to malnutrition during conception and pregnancy might be.

The health consequences were dramatic. Fifty years later, individuals who were conceived during the Hunger Winter weighed about 14 pounds more, had waists about 1.5 inches larger, and were three times more likely to have heart disease than those born to mothers who were in their second or third trimester of pregnancy during that time. By the time they reached age 63, they had experienced a 10% increase in mortality.

What caused those health consequences? Could the cause have been epigenetic? Recent research suggests that the answer might be yes.

A recent study analyzed epigenetic changes in DNA from blood samples of survivors born during the Hunger Winter that had been collected when they were 59 years old (Tobi et al, Int. J. Epidemiology, doi: 10.1093/ije/dyv043, 2015). This study showed:

  • A distinct pattern of DNA methylation was observed in survivors who were conceived during the Hunger Winter. This pattern of DNA methylation was not observed in survivors who were in their second or third trimester during the Hunger Winter. It was also not seen in people who were conceived immediately before or after the Hunger Winter.
  • Some of the genes with distinctive methylation patterns were genes that affected things like cholesterol levels and insulin sensitivity, which have the potential to increase disease risk.
  • Other genes with distinctive methylation patterns were genes that affected metabolism. They were “thrifty” genes that increased the efficiency of metabolism. Increased efficiency of metabolism is beneficial when calories are scarce, but can lead to obesity when calories are plentiful.

That is a truly remarkable finding when you think about it. If these data are true, they suggest that starvation during early pregnancy caused the fetus to make epigenetic changes to its DNA that allowed it to become more efficient at energy utilization, and those epigenetic changes have lasted a lifetime – even when food was abundant throughout the rest of that lifetime.

What Is Epigenetics And Can What We Eat Affect Our Kids?

The studies I featured in this article are powerful “proof of concept” that diet and environmental exposure during conception and pregnancy can result in epigenetic changes to the DNA of the offspring that can persist throughout their life and dramatically affect their health. However, it is not yet clear how they apply to you and me.

  • Agouti mice are a very special strain of mice. It is not yet clear what effect folic acid, genistein and bisphenol A have on epigenetic modification of specific human genes, and whether those epigenetic modifications will have health consequences in humans.
  • The specific circumstances of the Dutch Hunger Winter are unlikely to be repeated on any significant scale. The closest approximation I can envision would be a woman who becomes pregnant while on a very low calorie fad diet.

There are, of course, many other examples of heritable epigenetic modifications. For example:

  • When female rats are maintained on a “junk-food diet” high in fat and sugar during pregnancy and lactation, their offspring show a marked preference for high fat foods (Ong & Muhlhausler, FASEB J, 25: 2167-2179, 2011). They also show epigenetic alterations of the central reward pathways that may pre-condition them to require higher intakes of fat to experience pleasure from eating.
  • When rats are fed diets deficient in omega-3 fatty acids, adolescent rats from the second and subsequent generations display marked increases in hyperactivity and anxiety (for more details, see my “Health Tips From the Professor” article titled “The Seventh Generation Revisited”).
  • In a clinical trial of 162 obese Canadian mothers who had children before and after weight loss surgery, the children born after weight loss surgery were half as likely to grow up overweight or obese as the children born before the weight loss surgery (Smith et al, Journal of Clinical Endocrinology & Metabolism 94: 4275-4283, 2009), and this correlated with epigenetic modification of genes that play a role in obesity, diabetes, cancer and heart disease (Guénard et al, PNAS 110: 11439-11443, 2013).

Taken together, the existing data suggest that our diet and environmental exposure during conception and pregnancy can cause epigenetic changes to our children’s DNA that may affect their future health in ways that we can only begin to understand at present. It is a sobering thought.

 

The Bottom Line

 

  • The term epigenetics describes modifications to our DNA that turn our genes off and on.
  • In this article I discussed two powerful “proof of concept” studies, one in mice and the other in humans, showing that diet and environmental exposure during conception and pregnancy can result in epigenetic changes to the DNA of the offspring that can persist throughout their life and dramatically affect their health.
  • The health consequences of these epigenetic modifications include obesity, diabetes, cancer, heart disease, hyperactivity, anxiety and many more.
  • This is a new paradigm. Most prenatal nutrition advice is currently based on what it takes to have a healthy baby – not on what it might take for your child to experience better health throughout their life.
  • Of course, the science of epigenetics is relatively new. It will be many years before we will be able to make specific recommendations as to what your diet should be like during pregnancy and lactation if you wish to make beneficial modifications to your baby’s DNA.
  • However, you should be aware that what you eat during pregnancy & lactation may influence the health of your children – not just at the time of their birth – but throughout their life, and that a high calorie, “junk-food” diet or a fad weight loss diet just may not be your best choice.

*The agouti mice picture is by Randy Jirtle and Dana Dolinoy (E-mailed by author) [CC BY 3.0 (http://creativecommons.org/licenses/by/3.0)], via Wikimedia Commons.

 

These statements have not been evaluated by the Food and Drug Administration. This information is not intended to diagnose, treat, cure or prevent any disease.

Do Women Get Enough Omega-3 During Pregnancy?

Should Pregnant Women Take Omega-3 Supplements?

Author: Dr. Stephen Chaney

 

  • Long Chain Omega-3 Fatty Acids, Especially DHA, Are Essential For Normal Brain Development

Long chain omega-3 fatty acids, especially DHA, have been shown to be very important during pregnancy, especially during the third trimester, when DHA accumulates in the fetal brain at a very high rate. It is during that third trimester that the fetus forms the majority of the brain cells it will have for an entire lifetime.

Inadequate intake of long chain omega-3 during pregnancy and lactation has been shown to be associated with poor neurodevelopmental outcomes. These include poor developmental milestones, problem solving, language development and increased hyperactivity in the children (Coletta et al, Reviews in Obstetrics & Gynecology, 3, 163-171, 2010).

  • The Current Recommendation is 200 mg DHA/day During Pregnancy & Lactation.

In order to support brain development in the fetus, some experts have recommended an intake of 300 mg per day of DHA during pregnancy. The best dietary sources of long chain omega-3 fatty acids such as DHA are fish and fish oil supplements. However, because of concerns about seafood contamination with heavy metals and PCBs (both of which are neurotoxins), the FDA recommended in 2004 that pregnant women limit seafood consumption to two servings a week, which amounts to about 200 mg/day of DHA. This limit has subsequently been adopted by the American College of Obstetricians and Gynecologists and the European Union as the amount of DHA recommended during pregnancy and lactation (Coletta et al, Reviews in Obstetrics & Gynecology, 3, 163-171, 2010).

Even that recommendation for DHA from seafood could be overly generous. A recent study using the EPA risk assessment protocol concluded that some farmed salmon were so contaminated with PCBs that they should be eaten no more than once a year (Hites et al, Science, 303: 226-229, 2004).

  • Most Pregnant & Lactating Women In The US Are Probably Not Getting The Recommended Amount of DHA In Their Diet

Many pregnant women avoid seafood because of concerns about mercury and PCBs. Unfortunately, the other food sources of omega-3 fatty acids in the American diet, even many omega-3 fortified foods and supplements, are primarily composed of the short chain omega-3 fatty acid linolenic acid (also called alpha-linolenic acid or ALA), and only 1-4% of linolenic acid is converted to DHA in the body (Coletta et al, Reviews in Obstetrics & Gynecology, 3, 163-171, 2010).
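To see why that low conversion efficiency matters, here is a quick illustrative calculation. The 1-4% conversion range comes from the source; the rest is simple arithmetic:

```python
# How much ALA (short chain omega-3) would be needed per day to net
# 200 mg of DHA, if only 1-4% of ALA is converted to DHA in the body?
dha_target_mg = 200

for conversion_rate in (0.01, 0.04):
    ala_needed_mg = dha_target_mg / conversion_rate
    print(f"At {conversion_rate:.0%} conversion: {ala_needed_mg:,.0f} mg ALA/day")
# At 1% conversion: 20,000 mg ALA/day
# At 4% conversion: 5,000 mg ALA/day
```

That is 5 to 20 grams of ALA per day, which is why ALA-based foods and supplements are a poor substitute for preformed DHA.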

Consequently, experts have been concerned for some time that American and Canadian women may not be getting enough DHA during pregnancy and lactation, but it was not clear how serious an issue this was.

Do Women Get Enough Omega-3 During Pregnancy?

A group of scientists decided to test the adequacy of DHA intake by comparing DHA intake with the recommended 200 mg/day in a group of 600 pregnant and lactating women enrolled in the Alberta Pregnancy Outcomes and Nutrition study (Jia et al, Applied Physiology, Nutrition & Metabolism, 40: 1-8, 2015). The average age of the women in this study was 31.6 years. They were primarily Caucasian and married. 92% of them breastfed their infants. Most of them were taking a multivitamin or prenatal supplement on a daily basis. Approximately 1/3 of them were also taking a long chain omega-3 supplement.

The majority of women had completed college and had annual household incomes in excess of $100,000/year. In short, this was a very affluent, well-educated group of women. This is the kind of group one might consider most likely to be getting enough DHA from their diet.

DHA intake was based on 24 hour food recalls and supplement intake questionnaires collected in face-to-face interviews 2-3 times during pregnancy and again 3 months after delivery. The DHA content of the diet was determined from these data using well established methods.

The results were both dramatic and concerning.

  • Only 27% of pregnant women and only 25% of postpartum women who were breastfeeding met the recommendation of 200 mg of DHA/day. In short, nearly three-quarters of the women in the study were not getting enough DHA during pregnancy and lactation.
  • When the women who were taking DHA-containing supplements were excluded from the data analysis, only 13% of pregnant and lactating women were getting enough DHA from their diet. In short, nearly 90% of the women relying on diet alone were not getting enough DHA.
  • Taking a DHA-containing supplement increased the likelihood of achieving the recommended 200 mg DHA/day by 10.6 fold during pregnancy and 11.1 fold during breastfeeding.
  • Not surprisingly, seafood, fish and seaweed products were the major contributors to the total dietary DHA intake.

The authors concluded “Our results suggest that the majority of participants in the cohort were not meeting the EU recommendations for DHA during pregnancy and lactation, but taking a supplement significantly improved the likelihood that they would meet the recommendations.”

 

The Bottom Line

  • Long chain omega-3 fatty acids, especially DHA, are essential for normal brain development. Inadequate DHA intake during pregnancy and lactation is associated with delayed developmental milestones, poorer problem solving and language development, and increased hyperactivity in the children.
  • There is no established Daily Value for omega-3 fatty acids. However, the American College of Obstetricians and Gynecologists and the European Union recommend 200 mg DHA/day during pregnancy and lactation.
  • This recommendation is based partly on the amount of DHA needed for brain development and partly on the FDA warning that pregnant women should not consume more than 2 servings of fish/week due to heavy metal and PCB contamination.
  • This recommendation can be met by 1-2 six ounce servings/week of fish or a fish oil supplement containing 550-600 mg of omega-3 fatty acids.
  • Many pregnant women avoid fish because of concerns about contamination with heavy metals and PCBs, both of which are neurotoxins. Therefore, the major source of omega-3s in the American and Canadian diets are short chain omega-3 fatty acids that are only inefficiently (1-4%) converted to DHA.
  • Consequently, experts have been concerned for some time that American and Canadian women may not be getting enough DHA during pregnancy and lactation, but it was not clear how serious an issue this was.
  • A recent study done with a group of 600 women enrolled in the Alberta Pregnancy Outcomes and Nutrition study found that:
  • Only 27% of pregnant women and only 25% of postpartum women who were breastfeeding met the recommendation of 200 mg of DHA/day. In short, nearly three-quarters of the women in the study were not getting enough DHA during pregnancy and lactation.
  • When the women who were taking DHA-containing supplements were excluded from the data analysis, only 13% of pregnant and lactating women were getting enough DHA from their diet. In short, nearly 90% of the women relying on diet alone were not getting enough DHA.
  • Taking a DHA-containing supplement increased the likelihood of achieving the recommended 200 mg DHA/day by 10.6 fold during pregnancy and 11.1 fold during breastfeeding.
  • This was a very affluent, well-educated group of women. If any women anywhere are getting enough DHA during pregnancy and lactation, this should have been the group that was.
  • The authors concluded “Our results suggest that the majority of participants in the cohort were not meeting the EU recommendations for DHA during pregnancy and lactation, but taking a supplement significantly improved the likelihood that they would meet the recommendations.”

 

These statements have not been evaluated by the Food and Drug Administration. This information is not intended to diagnose, treat, cure or prevent any disease.
