The world of nutrition is filled with dogma, myths, and diet advice that lacks proper scientific support. If you’ve taken the time to read up on nutrition for yourself, you’ve probably realised that many of the things you’ve been told about diet and health are not only wrong, but may have caused you to make decisions that threw your health into a tailspin.
Sometimes, mainstream nutrition gets it right; however, in many instances, the opposite can be said. This doesn’t mean – like some people seem to think – that we are best off doing the complete opposite of what conventional dietary wisdom tells us to be true. Rather, it means that we should always take a closer look at the evidence before we jump to conclusions.
Since this article will cover 10 topics, I obviously won’t be able to discuss the smaller details or take an in-depth look at the scientific research on each and every item (without turning this into a 20,000-word post). Rather, this article will provide an overarching look at these topics and summarize the key things you need to know to make an informed decision about what to eat.
I’ve written lengthy articles on most of the topics mentioned in this post before, so if you’re interested in learning more about one or more of them, just click the links included in each section or use the search function on the site. You can also drop a comment in the comment section below the post if you want me to provide more information or additional sources for some of the claims made in this article.
Okay, let’s get to it…
1. The mainstream nutritional community fails to recognize the importance of Darwin’s theory of evolution via natural selection
Evolution through natural selection has been biology’s organizing principle for more than a century. Unfortunately, this powerful idea hasn’t yet made its way into the young discipline that is nutritional sciences. This is problematic for a number of reasons…
Contrary to what some people think, just looking at Randomized Controlled Trials (RCTs) and meta-analyses isn’t sufficient to be able to draw firm conclusions about how we should eat to achieve good health. While these types of studies are extremely valuable, they also have their limitations. Due to confounding variables, differences in study design, bias, and a wide range of other factors, the scientific literature is filled with studies that show conflicting results. Everyone can find a couple of studies or review papers that seemingly support their cause.
Without an evolutionary framework to guide our understanding of nutrition and the scientific literature, it’s impossible to make sense of all the seemingly conflicting diet advice and study results that are out there. The evolutionary lens allows us to look past current dietary dogma and trends and establish what types of foods humans are best adapted to eat.
It’s important to incorporate archaeological data and studies of hunter-gatherers and traditional people into nutritional sciences, because this research gives us many clues as to what types of diets humans are best adapted to eat. Moreover, it provides us with the information we need to formulate good theories and hypotheses that can be tested in RCTs.
It’s rarely possible to draw firm conclusions about how we should eat by looking at nutrition through the lens of evolution. Rather, an evolutionary outlook provides us with a foundation or template that we can use to build our ideas, research, and opinions about nutrition upon.
I don’t claim to have all the answers. Over the years I’ve adjusted my opinions on various nutrition topics as I’ve become aware of new science or read something that gave me a different perspective on things. One of the most important things this process has taught me is that I’m much less likely to make a mistake or give out diet advice that I later learn is flawed if I make sure my opinions and recommendations rest upon a solid foundation of evolutionary wisdom.
In summary, to really be able to create health-promoting dietary guidelines, it’s not enough to look at modern scientific research. We also have to consider what evolution can tell us about diet, health, and disease.
2. There’s no reason to fear coconuts, eggs, organ meats, and other Paleo-approved whole foods that contain saturated fat and/or cholesterol
For decades we’ve been told by public health authorities that we should shun foods that are high in fat and/or cholesterol and instead eat more carbohydrates. There’s now strong evidence to suggest that this advice may have done us more harm than good.
While it’s definitely true that consuming a lot of butter, bacon, ghee, oils, and other evolutionarily novel foods with an extremely high fat density is a bad idea if you’re looking to achieve good health, there’s no reason to shun whole foods such as game meat, organic eggs, organ meats, and coconuts. Actually, I’ll argue that most people would benefit from eating more of these types of foods, in particular organ meats.
Several lines of evidence indicate that animal source food became an increasingly important part of the hominin diet approximately 2.5 million years ago (1). Furthermore, estimates based on interpretations of ethnographic data on modern hunter-gatherers show that “whenever and wherever it was ecologically possible, hunter-gatherers consumed high amounts (45–65% of energy) of animal food” (2).
In other words, the idea that we are poorly adapted to consume animal products such as organ meat and eggs isn’t supported by the evolutionary evidence. However, we have to remember that the food we have access to at the supermarket today is very different from the foods our primal ancestors consumed.
Our ancient ancestors clearly ate animal source food high in fat, but it’s important to note that wild animals tend to be leaner than domesticated ones, with a fatty acid profile characterized by less saturated fat and more polyunsaturated fatty acids (3). And it isn’t just a small difference: grain-fed animals contain 2-3 times more saturated fat than game meat and much less of the essential omega-3 fatty acids. These unfavourable changes to the fatty acid profile of the meat we eat can help explain why some studies and reports have indicated that red meat intake may increase the risk of cancer (4, 5).
In my comprehensive article on saturated fat I took an in-depth look at what the scientific literature tells us about the health implications of consuming red meat, eggs, oils, and other fatty foods. One of the main takeaways from that article is that several non-westernized populations have a very low incidence of cardiovascular disease, including stroke and heart disease, despite a high intake of coconut, meat, and/or other whole foods high in fat and/or cholesterol. Moreover, as shown in the list below, several recent studies and systematic reviews question the idea that saturated fat is the big bad wolf.
- A recent meta-analysis of prospective cohort studies shows that there is no significant evidence for concluding that dietary saturated fat is associated with an increased risk of cardiovascular disease or coronary heart disease (6).
- A systematic review and meta-analysis from 2010 found that consumption of processed meats, but not red meats, is associated with higher incidence of CHD and diabetes mellitus (7).
- Saturated fat consumption doesn’t associate with heart attack risk (8).
- A 14 year prospective cohort study found no associations between intake of total fat, cholesterol, or specific types of fat and risk of stroke in men (9).
As I’ve mentioned many times on the blog, this doesn’t mean that we should throw all our concerns regarding saturated fat out the window and start incorporating plenty of high-fat cream, oils, ghee, and other very fatty foods into our diet again. Among other things, these foods have a poor micronutrient profile, a low satiety index score, and an unbalanced fatty acid composition. Rather, it means that we shouldn’t worry about the saturated fat and cholesterol found in nutrient-dense whole foods such as meat, eggs, and coconuts.
I want to make it clear that the quality of the food matters a lot. There’s a big difference between the grass-fed meat you’ll find at the farmers’ market and CAFO-produced, fatty meats from animals that have been pumped full of antibiotics and growth hormones. In other words, when buying animal food (and food in general for that matter), it’s very important to seek out high-quality products.
3. Breakfast is not the most important meal of the day
The idea that breakfast, eaten shortly after waking, is the most important meal of the day is still very much a part of conventional dietary wisdom. This idea is supported by neither the scientific literature nor the evolutionary evidence.
Our primal ancestors obviously didn’t get up early in the morning to prepare a big, warm bowl of oatmeal – and they didn’t eat 3 set meals every day. Hunter-gatherers often don’t eat their first meal of the day until many hours after waking up. They might snack on some leftovers from last night’s “dinner” or eat some low-calorie plant foods in the morning, but the first real meal is often consumed later in the day, after the hunting and/or gathering of the day has been carried out.
However, it’s important to note that there isn’t one universal “hunter-gatherer meal pattern”. Some forager communities have been known to consume just one or two meals a day, while others eat more frequently. It all depends on how much food they manage to get a hold of. If they are able to bring down a huge animal, they may gorge on meat throughout the day, while if food is scarce, they may go longer periods without eating.
The point here is that the information we gather by looking back at humans’ evolutionary journey doesn’t support the idea that we should eat our first meal of the day shortly after we wake up. On the contrary, the evolutionary template predicts that experiencing intermittent periods of fasting is a good thing.
Modern science also suggests that this is the case. Several studies and review papers have shown that intermittent fasting can improve the lipid profile, decrease inflammatory responses, change the expression of genes related to inflammatory responses, enhance fat loss, and aid in the prevention of metabolic and cardiovascular diseases, among other things (10, 11, 12, 13).
A recent review paper had the following to say about intermittent fasting and weight loss:
It appears that almost any intermittent fasting regimen can result in some weight loss. Among the 13 intervention trials included in this review, 11 (84.6%) reported statistically significant weight loss ranging from 1.3% in a crossover trial with a 2-week intervention to 8.0% in a 1-arm trial of 8 weeks’ duration. (13)
4. A carbohydrate intake of 45-65% of total energy intake is too much for most people
Public dietary guidelines in the U.S. and most other industrialized nations advocate that people should derive about 45-65% (the exact number differs from country to country) of their calories from carbohydrate. This idea, that we should all eat a diet in which carbohydrate is the dominant macronutrient, lacks proper scientific support.
Today, most people would say that a carbohydrate intake of 50 to 60% (of total calories) is “normal”, while a diet that contains 20-40% carbohydrate is a low-carb diet. However, if we look at things from an evolutionary perspective, it quickly becomes clear that it’s more accurate to label the modern, grain-based diet as a high-carbohydrate diet, while the so-called low-carb diet actually contains a percentage of carbohydrate that could be classified as the evolutionary norm for our species.
Our Paleolithic ancestors rarely ate grains and they didn’t have access to many of the other carbohydrate-rich foods that now make up a large portion of the typical Western diet. Rather, for our primal forebears, fiber-rich fruits, tubers, and vegetables were the main sources of carbohydrate available. Some prehistoric tribes may also have had access to honey, but only seasonally. In other words, they wouldn’t have consumed as much starch and simple carbohydrates as most people do today.
This statement is supported by several lines of evidence, including research which shows that hunter-gatherers typically derive about 22-40% of their total calories from carbohydrate (2). The exact amount depends on factors such as climate, season, geographic location, etc. (14).
Some systematic reviews (e.g., 15, 16) have shown that low-carb diets may be superior to low-fat diets when it comes to fat loss, metabolic health outcomes, and cardiovascular disease risk. However, it’s important to note that studies in this area show conflicting results. Why? Largely because a whole range of factors besides the macronutrient composition of the diet impact our health (e.g., a low-carbohydrate diet that’s high in bacon, oils, and butter is very different from one that’s high in seafood, eggs, avocados, and coconuts).
This is one of the many reasons why just looking at the randomized controlled trials and systematic reviews on diet and health isn’t always sufficient to draw firm conclusions. We have to look at the bigger picture, and perhaps most importantly, we have to look at what our ancestors can teach us about how to eat for good health.
In my 4-part series on carbohydrate intake I took an in-depth look at what the scientific literature tells us about carbohydrate intake and human health. One of the key takeaways from those articles is that the most important thing is that we eat the types of foods we are best adapted to eat, not that we strive to achieve a specific carbohydrate/fat/protein ratio. However, this doesn’t mean that the macronutrient composition of the diet is irrelevant.
As I point out in those articles, the weight of the evidence indicates that a carbohydrate intake of 45-65% of total calories is higher than optimal for most people. This is not to say that ditching all starchy foods and relying on butter, ghee, oils, and other foods with an extremely high fat density as your staple foods is a good idea. Rather, it means that a diet with a more balanced proportion of the different macronutrients is superior to a carb-heavy diet.
5. Calorie counting is not the way to go for sustained weight loss
If you’ve been reading about weight loss in a health & fitness magazine or listened to the aerobics instructor at your gym talk about effective fat loss strategies, you may have gotten the impression that the best way to lose weight is to count calories and spend hours sweating away on the treadmill every week. Is that all it’s about? Should we just tell people who want to lose weight to eat less and move more? No, as I’ve repeatedly highlighted here on the blog, this advice is way too simplistic.
Studies have shown again and again that calorie counting isn’t very effective for long-term weight loss. The explanation for this is simple: The human body is not just a passive vehicle that comes along for the ride, but rather a complex system that has its own mechanisms for regulating how much fat we carry.
Microorganisms in the gut, hormones (in particular the adipose-secreted satiety hormone leptin), etc. play a key role in regulating our basal metabolic rate, hunger, and satiety. When we deliberately restrict calories over time (without changing any other variable), we typically get hungry, lethargic, and fatigued. When we can no longer endure this situation, we usually end up eating to satiety again, and we start gaining back the lost weight.
While it’s certainly true that to lose weight, you have to expend more energy than you take in, simply telling someone who wants to lose weight to “eat less and move more” is not particularly good advice. To really be able to achieve sustained weight loss, we also have to consider how our food choices, lifestyle, and environment impact our hormone levels, gene expression, and gut microbiota, among other things. This is not to say that individuals who want to lose weight don’t need to pay any attention to portion control and energy intake. It just highlights the fact that simply focusing on calories in vs. calories out is way too simplistic.
The bacteria in your gut play a key role in controlling your hormone levels, appetite, and food preferences. While a healthy community of gut bacteria will help you maintain a lean, well-functioning body, a dysbiotic gut microbiota can make you metabolically deranged and “addicted” to sugar.
To lose weight, it’s very important to eat the right types of foods. Studies consistently show that individuals who adopt a Paleolithic, hunter-gatherer style diet reduce their total energy intake – sometimes by as much as 30% – even though they are allowed to eat as much food as they want (ad libitum) (17, 18, 19, 20). There are several possible explanations for this effect. Perhaps most importantly, the Paleo Diet is low in starch and simple carbohydrates and high in protein and fiber, characteristics that make for a good weight loss diet. Moreover, Paleo-approved foods have a high satiety index score and are very micronutrient dense.
Finally, while nutrition is very important in the context of body fat regulation and weight loss, it’s not the only thing to consider. Sleep, pharmaceutical use, physical activity, and all of the other lifestyle factors I discuss on this site are also key things to keep in mind. If you’re sleeping 4 hours every night and/or taking a bunch of microbiome-disrupting prescription meds, you’re not going to build a lean, healthy body regardless of how healthy you eat.
6. A “high” intake of protein (>20% of total calories) is not dangerous
Conventional dietary wisdom says that “high-protein diets” (>20% of total calories from protein) can damage your kidneys and increase your risk of developing a wide range of chronic diseases, including colon cancer and cardiovascular disease. As I’ve highlighted many times here on the blog, this theory makes little sense from an evolutionary perspective.
As mentioned in the section on saturated fat and cholesterol, several lines of evidence indicate that animal source food became an increasingly important part of the human diet starting approximately 2.5 million years ago. Moreover, it’s well established that animal food is an essential part of the diet of most hunter-gatherer tribes (estimates suggest that hunter-gatherers typically derive about 45–65% of their energy from animal food (2)), and that without meat, we would never have been able to develop our huge, energetically costly brains.
This is not to say that the stereotypical image of a prehistoric man eating huge chunks of meat every day necessarily reflects how things were. It just means that there’s little doubt that we are well-adapted to eat fairly large amounts of animal source food.
Studies have shown that while high-protein diets may be hazardous for individuals with chronic kidney disease, a high protein intake doesn’t seem to impair renal function in people with healthy kidneys (21, 22, 23).
While it’s certainly true that a high protein intake can be harmful if the protein is coming from processed fatty meat, protein powder, milk, and other evolutionarily novel, unhealthy foods, it’s not true if the protein is coming from wild-caught seafood, grass-fed meats, organic eggs, and other high-quality animal products.
As I’ve pointed out throughout my articles on this topic, protein increases satiety and thermogenesis to a greater extent than carbohydrate and fat (24, 25). Moreover, high-protein diets potentially improve leptin sensitivity in the central nervous system (26). It’s therefore not surprising that the weight of the evidence indicates that diets containing relatively high amounts of lean protein are superior to low-protein diets when it comes to fat loss and metabolic health outcomes (27, 28, 29, 30). In other words, eating “enough” high-quality protein throughout the day is particularly important for those who carry a lot of unwanted body fat around their abdomen, which is pretty much everyone these days.
In his excellent 3-part series titled “Evolution and High Protein Diets“, Dr. Loren Cordain discusses what the scientific literature has to tell us about high-protein diets and human health. He concludes with the following:
“The evolutionary evidence indicates that so called “high protein diets” (20-30% total energy) and “very high protein diets” (30-40% total energy) actually represent the norm which conditioned the present day the human genome over more than 2 million years of evolutionary experience. The evolutionary template would predict that human health and well being will suffer when dietary intakes fall outside this range. Hence the current U.S. consumption of protein (15% total energy) may not optimally promote health and well being. There is now a large body of experimental evidence increasingly demonstrating that a higher intake of lean animal protein reduces the risk for cardiovascular disease, hypertension, dyslipidemia, obesity, insulin resistance, and osteoporosis while not impairing kidney function.” (31).
Perhaps needless to say, this doesn’t mean that eating a diet that’s low in plant foods and very high in red meat, cottage cheese, and eggs is the way to go for good health. A healthful diet contains a “balanced” proportion of both plant foods and high-quality animal source food.
7. The mainstream nutritional community fails to recognize the important role the gut microbiome plays in nutrition and health
The trillions of microorganisms that live in our gut play a key role in regulating our immune system, metabolism, and intestinal barrier function. Gut bacteria also break down some of the food we eat, exert control over our appetite and eating behaviour, and impact our mood and mental state. Unfortunately, the mainstream nutritional community fails to fully recognize the important role this microbial rainforest plays in nutrition and health.
A lecture or two on dietary fiber, colonic fermentation, and short-chain fatty acids is often all that’s included in the curriculum of nutrition studies. Over the last couple of years, more focus has been placed on topics such as probiotics and prebiotics. However, there’s still a long way to go before the gut microbiome is given the attention it deserves.
Perhaps even more worrying, the gut microbiome isn’t taken into account when dietary guidelines for the public are created. Pretty much all of the attention is given to how we can best feed the human part of ourselves, while the microbial part is neglected.
This is problematic for a number of reasons, some of which I’ve listed below.
- Our dietary habits shape the gut microbiome. For example, a low-fiber, Western pattern diet selects for a gut microbiota that looks very different from the one you get if you eat a fiber-rich, ancestral diet. To really be able to design a healthy diet, we have to consider how our food choices impact the microbial community in our gut.
- It’s not enough to eat a nutrient-dense, healthy diet. To achieve good health, you also need a gut microbiome that is adapted to this diet. In other words, the consumption of a healthy diet doesn’t produce health if the gut microbiome is mismatched to the diet. A lot of people experience gastrointestinal distress and food intolerance because they lack the bacteria that are needed to properly break down the food they’re eating.
- The bacteria in our gut play a key role in regulating how much body fat we carry and have even been shown to exert control over our appetite and eating behaviour. As I’ve pointed out many times on the blog, one of the reasons people have trouble sticking to a healthy eating pattern and/or lose weight is that they harbour a dysfunctional gut microbiota. This is something that has to be taken into account when diet and weight loss advice is given.
- One of the main ways we acquire new gut bacteria is through eating food. For example, raw plant foods and fermented vegetables harbour a wide range of microorganisms. These food-borne microorganisms may play an important role in shaping our health.
8. Cereal grains shouldn’t form the foundation of your diet
Today, cereal grains such as wheat and barley are important staple foods all around the world, and a lot of people would probably say that a diet devoid of grains is a very peculiar diet. However, as the readers of this blog know well, it hasn’t always been like this. Cereal grains didn’t make their way into the human diet in any significant quantities until the Agricultural Revolution started sweeping the globe about 10,000 years ago. This may seem like a long time ago, but from an evolutionary perspective, it’s merely a couple of ticks on the clock.
When compared to the tiny microorganisms that live all around us and in and on our bodies, humans evolve at an extremely slow pace. As I’ve repeatedly pointed out here on the blog, we’re still – to a significant extent – hunter-gatherers from a genetic perspective. There has been inadequate time and selection pressure for the human body to adapt to many of the recent changes in the human diet. This includes, among other things, the dramatic increase in cereal grain consumption.
An increased incidence of a wide range of diet-related health problems, including iron deficiency anemia, tooth decay, and several bone mineral disorders, accompanied the Agricultural Revolution, either at first or gradually (32, 33, 34). Moreover, early farmers didn’t grow as tall as preagricultural humans (32, 33). All of these unfavourable health effects came largely as a result of the transition from a Paleolithic diet to a “Neolithic”, grain-based diet, a diet that in comparison was low in many vitamins and minerals and high in starch and antinutrients (32, 33, 34).
This dietary transition undoubtedly resulted in changes to the human microbiome – especially oral and gut – and altered gene expression patterns. Suddenly we collided with a diet that natural selection had never adapted us for.
Cereal grains didn’t end up at the bottom of the food pyramid because there’s strong scientific evidence to show that a grain-based diet is healthy for us. Rather, one of the key reasons they ended up there is that cereal grains are a cheap and accessible source of calories that have long been an essential part of the human diet all over the world. As Denise Minger explains in her book “Death by Food Pyramid”, the science underlying the food pyramid is far from rock solid.
The advice to eat more whole grains is partly based on studies that have investigated the health effects of consuming whole grains vs. refined grains. Naturally, these studies conclude that whole grains are superior. However, when intervention studies compare the “healthfulness” of a diet that’s rich in whole grains with that of a Paleo Diet, which is completely devoid of grains, the Paleo Diet is consistently shown to be superior (35, 36, 37, 38, 39).
I’ve discussed the negative health effects associated with the consumption of cereal grains here on the blog before, so I’m not going to delve into this again. Rather, I thought I’d provide a brief summary of some of the key reasons why putting grains at the bottom of your food pyramid may not be the best idea.
- Cereal grains have a very high carbohydrate density. This may not be a problem for those who are very physically active, but for the majority of people, it is. For millions of years, our ancestors got most of their carbohydrates from fibrous fruits, nuts, seeds, and tubers and other vegetables. When compared with cereal grains, these foods have a very low carbohydrate density and elicit a low insulin response. I’ll argue that these are the types of foods we are best adapted to eat. Consuming a diet rich in cereal grains and/or other high-carbohydrate foods is a particularly bad choice if you want to lose weight and/or are insulin resistant.
- Cereal grains contain a wide range of proteins and antinutrients (e.g., lectins, phytic acid) that may impair nutrient absorption, disrupt normal gut physiology, and contribute to the manifestation of chronic inflammation and autoimmune diseases by increasing intestinal permeability and initiating a pro-inflammatory immune response (32, 40, 41, 42).
- When compared with fruits and vegetables, cereal grains are less nutrient dense (on a calorie-by-calorie basis) and lower in prebiotic fiber (32, 43).
- Cereal grains are deficient in one or more vitamins and essential amino acids (34). Moreover, none of the domesticated cereals have adequate iron, calcium, or zinc (34). In itself, this doesn’t say that cereal grains are unhealthy. However, it highlights one of the many problems of relying on cereal grains as a staple food.
- Cereal grains, in particular refined grains, may alter the gut microbiota and increase the absorption of bacterial endotoxins from the small intestine (35).
- Cereal grains such as wheat contain opioid peptides that bind to opioid receptors in the brain and may trigger addictive-like responses (44).
9. Most vegetable oils aren’t healthy
Conventional nutritional wisdom says that margarine and vegetable oils such as sunflower oil and canola oil are healthier than oils that are high in saturated fat (e.g., coconut oil) and animal fats such as lard and butter. This common belief was imprinted in the public’s mind by dietary guidelines, which for decades have advocated that we should shun saturated fats and instead opt for foods that are high in monounsaturated and polyunsaturated fatty acids.
Over the last century, vegetable oils have made their way into all sorts of processed food products. Moreover, a lot of people use generous amounts of vegetable oils every day for cooking, salad dressings, etc. This is unfortunate, because a high intake of vegetable oils may have a wide range of negative health effects.
Vegetable oils, which can be broadly defined as a large group of oils that are esters of fatty acids and glycerol, obtained from the leaves, fruit, or seeds of plants, differ in their micronutrient composition and fatty acid profile. Not all vegetable oils are equally bad. For example, virgin olive oil and coconut oil are far superior to oils made from corn, sunflowers, soybeans, and canola.
Recent research has shown that cooking with corn and sunflower oil can release high levels of toxic chemicals known as aldehydes (45). These aldehydes may play a role in a wide range of chronic diseases, including cancer, heart disease, and dementia. In contrast, the heating of butter, olive oil, coconut oil, and lard produces much lower levels of aldehydes.
The main problem with vegetable oils is that they tend to have a very high omega-6/omega-3 ratio. (Most vegetable oils have omega-6/omega-3 ratios of more than 3). The high omega-6/omega-3 ratio of the current Western diet can largely be attributed to the massive infusion of vegetable oils starting in the early 1900s (46).
As highlighted in the quote below, a high intake of omega-6 and/or a low intake of omega-3 can promote the pathogenesis of several chronic diseases.
Anthropological and epidemiological studies and studies at the molecular level indicate that human beings evolved on a diet with a ratio of omega-6 to omega-3 essential fatty acids (EFA) of approximately 1 whereas in Western diets the ratio is 15/1 to 16.7/1. A high omega-6/omega-3 ratio, as is found in today’s Western diets, promotes the pathogenesis of many diseases, including cardiovascular disease, cancer, osteoporosis, and inflammatory and autoimmune diseases, whereas increased levels of omega-3 polyunsaturated fatty acids (PUFA) (a lower omega-6/omega-3 ratio), exert suppressive effects. Increased dietary intake of linoleic acid (LA) leads to oxidation of low-density lipoprotein (LDL), platelet aggregation, and interferes with the incorporation of EFA in cell membrane phospholipids. Both omega-6 and omega-3 fatty acids influence gene expression. Omega-3 fatty acids have anti-inflammatory effects, suppress interleukin 1beta (IL-1beta), tumor necrosis factor-alpha (TNFalpha) and interleukin-6 (IL-6), whereas omega-6 fatty acids do not. Because inflammation is at the base of many chronic diseases, dietary intake of omega-3 fatty acids plays an important role in the manifestation of disease, particularly in persons with genetic variation, as for example in individuals with genetic variants at the 5-lipoxygenase (5-LO). (47)
10. Milk is not a health food
Milk is the ultimate health food, right? If you’ve seen one of the many “Got Milk?” ads showing celebrities with a milk mustache or read about the importance of getting enough calcium into your body in order to build robust, strong bones, this is what you may have been led to believe.
The dairy industry has truly done a remarkable job of selling us on the idea that this white, liquid substance – which wasn’t a part of the human diet for 99%+ of our evolutionary history – is a health food that should be a part of everyone’s breakfast. As I’ve pointed out many times here on the blog, the reality is something very different.
Approximately 70% of the world’s population produce low levels of lactase, the brush-border enzyme that is responsible for breaking down lactose in the human small intestine. Unless they harbour a gut microbiome that is adapted to break down lactose, these individuals typically experience gastrointestinal discomfort when they consume milk and other lactose-containing dairy products.
Some people (primarily those from Northern Europe and their descendants) retain high gut lactase activity as adults and don’t experience gastrointestinal distress when they consume lactose-containing food. However, it’s important to note that this doesn’t necessarily mean that drinking milk is healthy for them. We have to keep in mind that natural selection does not select for health, but only for reproductive success.
This quote helps explain why consuming milk may not be the best idea for adult humans:
All mammals, including humans, are intended to be nourished during infancy by milk from their mother. Part of the very definition of a mammal is that the female of the species has milk-producing glands in her breasts which provide nourishment for her young. Each species of mammal produces its unique type of milk designed specifically to strengthen the immune system and provide nourishment for their babies, which are weaned after their birth weight has approximately tripled.
So, absolutely yes, “milk is a natural”… in the proper context. It is perfectly natural for infant mammals, including humans, to be nourished exclusively by milk from their mother’s breasts. So if we are talking about human breast milk for babies, yes, “milk is the perfect food.” And yes, during infancy when we have no teeth for eating solid food, and as we need to strengthen our immune system, “everybody needs milk”. (48)
As shown in the list below, the scientific evidence suggests that milk consumption – in particular consumption of the non-organic, low-fat, pasteurized, and homogenized milk that most people drink these days – may do you more harm than good.
- Milk contains a wide range of hormones and bioactive peptides, some of which breach the gut barrier and interact with our immune system (49). Perhaps needless to say, these compounds were designed by natural selection to support the growth of newborn calves, not to enhance health and longevity in adult humans. However, it’s important to note that it’s still unclear how many of these compounds affect our physiology and general health.
- Pasteurization and homogenization can force milk casein and fats into new configurations that make the proteins stackable into fibers/amyloids (50). These milk protein fibers may play an important role in diseases such as type 1 diabetes and Alzheimer’s disease (50).
- Milk has been implicated in the pathogenesis of a wide range of chronic diseases and health disorders, such as heart disease, insulin resistance, acne vulgaris, and Parkinson’s disease (49, 51, 52).
- Milk is less nutrient dense than fruits, vegetables, seafood, and other Paleo-approved foods (3).
- When compared to the foods that made up the bulk of Paleolithic diets, milk is extremely high in calcium. Contrary to what dairy lobbyists want you to believe, this could actually be a bad thing, as an abnormally high intake (from an evolutionary perspective) of calcium may cause mineral imbalances and increase the risk of heart attacks, among other things (49). Moreover, several large meta-analyses have shown that calcium intake is not significantly associated with hip fracture risk in women or men (53, 54, 55). One of these analyses even found that calcium supplementation may increase hip fracture risk (56). Yes, calcium is important, but maybe we’re better off getting it from green vegetables?
Frank Oski, M.D., former Director of the Department of Pediatrics of Johns Hopkins University School of Medicine and Physician-in-Chief of the Johns Hopkins Children’s Center, had the following to say about milk consumption in his book “Don’t Drink Your Milk!”:
Among physicians, so much concern has been voiced about the potential hazards of cow milk that the Committee on Nutrition of the prestigious American Academy of Pediatrics, the institutional voice of practicing pediatricians, released a report entitled, ‘Should Milk Drinking by Children Be Discouraged?‘ Although the Academy’s answer to this question has (as of this writing) been a qualified ‘maybe,’ the fact that the question was raised at all is testimony to the growing concern about this product, which for so long was viewed as sacred as the proverbial goodness of mother and apple pie. (48)
Now I want to hear from you: Have you been tricked into making poor diet choices by conventional dietary wisdom? Do you have any comments or questions about any of the 10 things I mention in this post? Do you have any additions to the list?
Pictures: 1: Creative Commons (CC) picture by bigbrand, 2: Wikimedia Commons picture, 3: CC picture by Haflz Issadeen, 4: CC picture by jeffreyw, 5: CC picture by Susie Inverso, 6: CC picture by Foodfacts pm, 7: Freeimages.com picture, 8: CC picture by AJ Cann, 9: CC picture by cristian, 10: CC picture by Cottonseed Oil, 11: Freeimages.com picture by iliana.