Thesis #21: Civilization makes us sick.

by Jason Godesky

The Paleolithic was not an era of perfect health. The Neanderthals, for instance, show signs of trauma consistent with those of rodeo cowboys–suggesting a certain rough-and-tumble life with big game. There were certainly a few diseases in circulation, and accidents, of course, happened. However, the claim so often made by progressivists that civilization has made us healthier could not be more incorrect. Civilization has most definitely made us much less healthy, and in innumerable ways.

The first has been the introduction of the epidemic disease. Epidemiologists typically divide diseases into one of two broad categories: endemic and epidemic. Endemic diseases are always circulating in a population, and most members of the population have some immunity to them. Endemic diseases can be serious, but for the most part they are accepted as a simple fact of life, as the population grows used to them. Chicken pox is endemic to most First World populations, for example. Formally, an endemic disease is an infection that can be maintained in a population without external inputs. Mathematically, an endemic disease sits at a steady state, R0 × S = 1, where R0 is the basic reproduction number and S is the susceptible fraction of the population: every individual who is infected passes the infection on to exactly one other person, on average. If the effective rate of contagion is less than that, the infection will simply die out. If it is more, it will become an epidemic.
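To make the arithmetic concrete, here is a minimal sketch in Python (an illustration only–the function name and the numbers are invented, not drawn from any real disease data) of how the product R0 × S sorts an infection into one of these three fates:

```python
def classify_infection(r0: float, susceptible: float) -> str:
    """Classify an infection by its effective reproduction number.

    r0: basic reproduction number (secondary cases per case when
        everyone is susceptible).
    susceptible: the fraction S of the population still susceptible.
    """
    r_effective = r0 * susceptible
    if r_effective < 1:
        return "dies out"   # each case infects fewer than one other person
    if r_effective > 1:
        return "epidemic"   # each case infects more than one other person
    return "endemic"        # steady state: exactly one new case per case

# Invented numbers, purely for illustration:
print(classify_infection(r0=10.0, susceptible=0.10))  # endemic: 10 x 0.10 = 1
print(classify_infection(r0=10.0, susceptible=0.05))  # dies out
print(classify_infection(r0=10.0, susceptible=0.90))  # epidemic ("virgin soil")
```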

Epidemics are another thing altogether. An epidemic disease is new to a population, and so burns through it without meeting any immune response whatsoever. Epidemics burn themselves out quickly, but leave a great deal of mortality and suffering in their wake. Some survivors begin to develop an immune response, and eventually the epidemic kills or infects everyone it can–leaving only the immune alive (with the exception of some minority protected by the “herd effect,” who cannot be infected because they’re surrounded by people who are immune). The Plague which ravaged Europe several times over was an epidemic; each iteration was slightly less devastating than the last, as each left a larger segment of the population with immunity. When an epidemic infects the worldwide population of a species, it is a pandemic.
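The dynamic is easy to see in a toy simulation. Below is a bare-bones, discrete-time SIR model in Python–a standard textbook construction, not anything from this essay’s sources, with every parameter invented purely for illustration. It shows an epidemic tearing through a naive population and then burning itself out:

```python
# Toy discrete-time SIR model; every number here is illustrative, not real data.
r0 = 3.0      # basic reproduction number in a fully susceptible population
gamma = 0.5   # fraction of the infected removed (recovered or dead) each step
s, i = 0.999, 0.001   # susceptible and infected fractions of the population

steps = 0
while i > 1e-6:                             # run until the epidemic burns out
    new_cases = min(s, r0 * gamma * s * i)  # transmission slows as S shrinks
    s -= new_cases
    i += new_cases - gamma * i
    steps += 1

print(f"epidemic burned out after {steps} steps")
print(f"never infected (the herd-protected remainder): {s:.1%}")
print(f"infected at some point (now immune or dead): {1 - s:.1%}")
```

In this toy run, only a few percent of the population comes through without ever being infected–not because those people are immune, but because the infection runs out of susceptible hosts around them first. That never-infected remainder is the “herd effect” minority described above.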

The epidemic disease is something new, a gift of civilization. Most epidemics are zoonotic–they come from animals. That is how we become exposed to so many unfamiliar pathogens: once a pathogen mutates sufficiently to jump the species barrier, what was endemic to our domesticates is epidemic to us. Chicken pox, measles, smallpox, influenza, diphtheria, HIV, Marburg virus, anthrax, bubonic plague, rabies, the common cold, and tuberculosis all came from animal domestication. If epidemic diseases did arise in the Paleolithic, they were short-lived: hunter-gatherer bands were too small, and had contact with one another too infrequently, to allow an epidemic to spread. An epidemic may have wiped out a whole band, but it would die out there. Domestication brought humans into sufficiently close contact with other animal species to allow their germs to adapt to our bodies, created concentrated populations where diseases could incubate, and even provided long-range trade to export those germs, once fully developed, to other concentrated populations. In Guns, Germs & Steel, Jared Diamond points to these titular germs as one of the main reasons that civilization was able to destroy all other societies. By the time the conquistadors set foot in the New World, smallpox had already wiped out 99% of the native population.

Civilization did not only introduce us to disease as we know it, though. It also introduced a novel way of life that was completely at odds with the evolutionary expectations of the human body. Humans remain Pleistocene animals; the short 10,000 years since the end of the last ice age has been meager time to adapt ourselves to such a radically different way of life. One factor that aided the spread of such disease was the rampant malnutrition that accompanied the Neolithic. Where foragers rely on a vast diversity of life that is nearly impossible to eliminate, and thus almost never starve, agriculture introduced the concept of “famine” to humanity by relying completely and utterly on a small number of closely related species. Starvation in the Neolithic was the norm rather than the exception. In “The Worst Mistake in the History of the Human Race,” Jared Diamond wrote:

One straightforward example of what paleopathologists have learned from skeletons concerns historical changes in height. Skeletons from Greece and Turkey show that the average height of hunter-gatherers toward the end of the ice ages was a generous 5’ 9″ for men, 5’ 5″ for women. With the adoption of agriculture, height crashed, and by 3000 B. C. had reached a low of only 5’ 3″ for men, 5’ for women. By classical times heights were very slowly on the rise again, but modern Greeks and Turks have still not regained the average height of their distant ancestors.

Another example of paleopathology at work is the study of Indian skeletons from burial mounds in the Illinois and Ohio river valleys. At Dickson Mounds, located near the confluence of the Spoon and Illinois rivers, archaeologists have excavated some 800 skeletons that paint a picture of the health changes that occurred when a hunter-gatherer culture gave way to intensive maize farming around A. D. 1150. Studies by George Armelagos and his colleagues then at the University of Massachusetts show these early farmers paid a price for their new-found livelihood. Compared to the hunter-gatherers who preceded them, the farmers had a nearly 50 per cent increase in enamel defects indicative of malnutrition, a fourfold increase in iron-deficiency anemia (evidenced by a bone condition called porotic hyperostosis), a threefold rise in bone lesions reflecting infectious disease in general, and an increase in degenerative conditions of the spine, probably reflecting a lot of hard physical labor. “Life expectancy at birth in the pre-agricultural community was about twenty-six years,” says Armelagos, “but in the post-agricultural community it was nineteen years. So these episodes of nutritional stress and infectious disease were seriously affecting their ability to survive.”

Over the course of millennia, we have gradually recovered from the enormous mortality of the Neolithic, to the point where most First Worlders now enjoy a quality of life just shy of that of our Mesolithic ancestors. That doesn’t mean our current diet is healthy, only that it is plentiful enough to keep us alive. Boyd Eaton called it “affluent malnutrition”–we eat a great deal of food, but what we eat is horribly maladapted to the human body. Affluent malnutrition is so lacking in basic micronutrients that many of us require vitamin supplements. Its other hallmarks include:

  • Highly processed foods deficient in important vitamins and minerals
  • Synthetic food compounds
  • An abundance of refined sugars
  • An abundance of saturated fat
  • A deficiency of fibre
  • Mega-size portions
  • An excess of calories

The human body evolved to expect a diet primarily of animals. Fat provided most of the body’s energy, and protein provided the necessary materials for the large human brain. Wild edibles provided vitamins and minerals in abundance. A single cup of crushed dandelion leaves contains more vitamin C than two glasses of orange juice.

Instead, some 99% of the world’s current diet is supplied by wheat, rice, or corn. Ben Balzer’s “Introduction to the Paleo Diet” outlines the main problems with these cereal grains and how poorly adapted the human body is to them:

Consider our friend, the apple. When an animal eats an apple, it profits by getting a meal. It swallows the seeds and then deposits them in a pile of dung. With some luck a new apple tree might grow, and so the apple tree has also profited from the arrangement. In nature as in finance, it is good business when both parties make profit happily. Consider what would happen if the animal were greedy and decided to eat the few extra calories contained within the apple seeds–then there would be no new apple tree to continue on the good work. So, to stop this from happening, the apple seeds contain toxins that have multiple effects:

  1. Firstly, they taste bad–discouraging the animal from chewing them.
  2. Secondly, some toxins are enzyme blockers that bind up predators’ digestive enzymes; these also act as “preservatives,” freezing the apple seed enzymes until sprouting. Upon sprouting of the seed, many of these enzyme blockers disappear.
  3. Thirdly, they contain lectins–these are toxic proteins which have numerous effects. They act as natural pesticides and are also toxic to a range of other species including bacteria, insects, worms, rodents and other predators including humans.

Of course, the apple has other defenses–to start with, it is high above the ground, well out of reach of casual predators, and it also has the skin and flesh of the apple to be penetrated first. Above all, though, is the need to stop the seed from being eaten, so that new apple trees may grow.

Now, please consider the humble grain. Once again, as a seed, its duty is mission critical–it must perpetuate the life cycle of the plant. It is, however, much closer to the ground, on the tip of a grass stalk. It is within easy reach of any predator strolling by. It contains a good source of energy, like a booster rocket for the new plant as it grows. The grain is full of energy and in a vulnerable position. It was “expensive” for the plant to produce. It is an attractive meal. Its shell offers little protection. Therefore, it has been loaded with toxic proteins to discourage predators–grains are full of enzyme blockers and lectins. You may be surprised to learn that uncooked flour is very toxic…

Once again, it is a simple matter of adaptation. In fact, some varieties of anthropoid have adapted to eating grain in the past, such as Paranthropus boisei; however, that is a variety unrelated to us. We are descended from the Australopithecus branch, which focused on scavenging, while Paranthropus focused on grain–and died out. Humans lack the necessary enzymes to digest these cereal grains properly, the way birds do. Instead, the grains lead to a host of health problems–including, possibly, cancer.

Lectins–found in cereals, potatoes, and beans–have effects throughout the body as widespread and significant as our own hormones, but, originating from outside our bodies, they react with our physiology in ways that are often quite harmful. They can strip off protective mucous tissues, damage the small intestine, form blood clots, make cells react as if stimulated by a random hormone, stimulate cells to secrete random hormones, make cells divide at improper times, cause lymphatic tissues to grow or shrink, enlarge the pancreas, or even induce apoptosis. In an editorial for the British Medical Journal titled “Do dietary lectins cause disease?” David L J Freed answers the question affirmatively, writing:

Until recently their main use was as histology and blood transfusion reagents, but in the past two decades we have realised that many lectins are (a) toxic, inflammatory, or both; (b) resistant to cooking and digestive enzymes; and (c) present in much of our food. It is thus no surprise that they sometimes cause “food poisoning.” But the really disturbing finding came with the discovery in 1989 that some food lectins get past the gut wall and deposit themselves in distant organs.

The question of whether or not the lectins in grain cause cancer is still open, but there is certainly a good deal to suggest it. Lectins are well known to cause cancer-like reactions in colon cells in a test tube. Franceschi et al., “Intake of macronutrients and risk of breast cancer” (Lancet 1996;347(9012):1351-6), showed that while risk of breast cancer went down with total fat intake, it rose with carbohydrate intake. But the original study on a correlative “cause” of cancer remains the most compelling: Stanislaw Tanchou’s 1843 study, which found a nearly perfect correlation between cancer in major European cities and grain consumption. Tanchou predicted that no forager would ever be found with cancer, initiating a frenzied search for a counterexample. Though no such forager was ever found, cancer often became commonplace among those same populations once they settled into an agricultural lifestyle. Between the effects of grain and more recent environmental factors, it seems evident that the natural occurrence of cancer among our foraging ancestors must have been negligible. In the modern United States, some 50% of men and 33% of women will suffer from some kind of cancer. Among foragers, we have significant difficulty producing even a single example.

There is also significant and widespread intolerance to grain. In “Why So Many Intolerant To Gluten?”, Luigi Greco writes of intolerance to the gluten in grain:

Having had over 25 years of variegated experience with gluten intolerance I find hard to imagine that the single most common food intolerance to the single most diffuse staple food in our environment might provoke such a complexity of severe adverse immune-mediated reactions in any part of the human body and function. The list is endless, but malignancies, adverse pregnancy outcome and impaired brain function are indeed complications above the tolerable threshold of this food intolerance.

Dairy is also a new and disastrous introduction to the human menu. All mammals lose their ability to produce lactase–the enzyme that breaks down the lactose in milk–when they reach maturity. At about 4000 BCE, a mutation occurred in Sweden and the Middle East, allowing those populations to continue producing lactase into maturity. This was a useful adaptation in their societies, with their adoption of herds of domesticated cattle, and so the mutation spread. However, “lactose intolerance” remains the norm across most human populations. The prevalence of this bizarre mutation amongst the socio-politically powerful northern Europeans has led to a strange situation where the normal state of affairs is referred to as if it were a malady. While humans with this mutation can digest milk, it remains something the human body is ill-equipped for. Cow milk is tailor-made for calves, just as human milk is suited for human babies–but the requirements of cows differ markedly from those of humans. Consumption of cow milk has been linked to iron deficiency anemia, allergies, diarrhea, heart disease, colic, cramps, gastrointestinal bleeding, sinusitis, skin rashes, acne, increased frequency of colds and flus, arthritis, diabetes, ear infections, osteoporosis, asthma, autoimmune diseases, and more–possibly even lung cancer, multiple sclerosis, and non-Hodgkin’s lymphoma.

The “Paleolithic Diet” is often referred to as a “low-carb diet,” which it is, and while it retains the weight-reducing properties of the more popular Atkins diet (since the body knows how to turn carbohydrates into body fat, but not protein or fat), it is far more sustainable and conducive to long-term health than Atkins. Individuals on a Paleolithic Diet report not only dramatic weight loss, but less hunger, more energy, and even greater mental acuity. Even in civilization, taking up nothing more than the diet of a forager leads to dramatic improvements in health.

The “diseases of civilization” are so well known as to hardly bear repeating. While we work far longer hours than even the most overworked forager, our work is quite different. Affluent First Worlders are too busy working extraordinarily long hours sitting behind desks to exercise, while agrarian societies emphasize back-breaking labor for the torso, back and arms. A cursory examination of the human body’s construction shows that it is adapted best to one activity: walking. Whether hunting or gathering, most of a forager’s short work day consists simply of walking for hours at a time. The sedentism of First World life has led to a host of maladies.

At the same time, we suffer from the psychosomatic and mental disorders that are the result of such stressful lives. We are primates adapted to small, egalitarian bands, but we find ourselves locked into large-scale, hierarchical societies. Even primates that are adapted to hierarchy show signs of stress when they occupy the lower ranks–and their hierarchies are nowhere near as pyramidal as ours, which seem almost designed to maximize the number of stressed-out unfortunates at the bottom. Our personality and our ability to cope can allow us to survive such a maladaptive situation, but we feel it all the same, particularly with the constant, ever-escalating competitiveness of a civilization that must always grow or die. High stress is endemic to the civilized population; it has become the leading cause of death in the United States. At the same time, while one quarter of U.S. citizens suffer from some form of mental illness, one would be hard-pressed to find any examples of mental illness among foragers.

Indeed, even those maladies which we consider to be merely the onset of old age, such as frailty and senility, are difficult to find among foragers, suggesting that even these may be the result of a maladapted, civilized diet.

Pleistocene humans were not always in perfect health, but the natural state of health for most animals in the wild is far, far superior to that which we find ourselves in. Humans did not evolve to be unique in the animal kingdom for our sickly, malnourished, and weak forms. We evolved to enjoy the same level of health as every other animal, but for 10,000 years, we have lived contrary to human nature, creating a great deal of stress and mental anguish. We have taken foods that are not entirely edible for us as our staples, and we eat them in ever-increasing quantities to counterbalance their anti-nutritional effects.

Trackbacks & Pingbacks

  1. […] You are what you eat, right? It’s more than a simple cliché; it’s a biological truth. The foods you eat are broken down chemically into the building blocks of your body. You are literally made of food. Civilization makes us sick, largely because it feeds us foods we have no adaptation to. Our civilized bodies are made of alien food–bird food, milk for baby calves, but precious little human food. Foragers enjoy a far better life than we do, on many levels, but the transition from civilized to forager is not an easy one. How much of a benefit could we expect, even if all the stress and iniquity of our civilized lives remains constant, just from eating like a forager? […]

    Pingback by Going Paleo » The Anthropik Network — 28 February 2006 @ 8:39 PM

  2. […] About This Site « Thesis #21: Civilization makes us sick. Methods of Freedom: Non-Hierarchal Thought » […]

    Pingback by Thesis #22: Civilization has no monopoly on medicine. (The Anthropik Network) — 17 October 2006 @ 10:39 AM

  3. […] Nothing in human existence has had a more profoundly negative impact on our quality of life than civilization. As we have already seen, it introduced the unnecessary evil of hierarchy (see thesis #11); it introduced the difficult, dangerous, and unhealthy agricultural lifestyle (see thesis #9); it makes us sick (see thesis #21), but provides no better medicine to counterbalance that effect (see thesis #22). It introduced endemic levels of stress, a diet and lifestyle maladapted and deleterious to our health, war as we know it, and ecological disaster, but it has given us nothing to counterbalance those effects; it has no monopoly on medicine, or knowledge in general (see thesis #23), or even art (see thesis #24), making the overall impact of civilization on quality of life disastrous. […]

    Pingback by Thesis #25: Civilization reduces quality of life. (The Anthropik Network) — 17 October 2006 @ 10:40 AM

  4. […] We know that civilization makes us sick, and that we have not developed any superior medicine to balance that out. Humans never evolved to eat grains, yet we eat them almost exclusively. Even people who remain entirely civilized in all other respects but adopt a “Paleo Diet” eschewing grains note remarkable improvements in health. What should surprise us would be anything else. Humans evolved as hunter-gatherers; wouldn’t it make sense for us to be healthy in our evolutionary niche, and for our health to suffer when we are not? Archaeologists have noted a “Neolithic Mortality Crisis,” during which longevity dropped off precipitously by half or more. This is dramatically illustrated by the finds at Dickson’s Mounds. […]

    Pingback by “The Savages are Truly Noble” (The Anthropik Network) — 10 May 2007 @ 3:27 PM

  5. […] that it is a difficult, dangerous and unhealthy way of life (thesis #9), that it makes us sick (thesis #21), and that it cannot provide medicine (thesis #22) or knowledge (thesis #23) beyond that which is […]

    Pingback by The Anthropik Network » Thesis #24: Civilization has no monopoly on art. — 31 July 2007 @ 2:54 PM

  6. […] is that although a junk food diet is bad, dropping the junk food won’t be enough because the real problem is our grain based diet. I probably need to start eating like my ancestors did and to learn to love liver which is chock […]

    Pingback by Synchronous Food 2 « Villageblog — 8 September 2007 @ 6:15 AM

  7. […] Archaeological evidence shows us that agriculture and grain-eating has had a terrible impact on health conditions. Of course, I feel too lazy to quote it all right now, so why don’t you check out Jason’s essay at Anthropik called: Civilization Makes Us Sick. […]

    Pingback by Urban Scout » Blog Archive » Pizza Vs. Rewilding — 6 November 2007 @ 3:22 PM


Comments

  1. Excellent article. Just a quick point, because this reminds me of something I wanted to post over on the shamanism thread. I have seen it mentioned by a Paleo diet researcher (Cordain) that schizophrenia may be related to grain consumption - apparently populations that don’t eat grains have very low levels of that condition. If that is so, it would mean we are wrong when we include schizophrenia as having some relationship with or relevance to shamanistic states - it’s just another of the many, many, many diseases of civilization.

    Unrelatedly, how can I wean my loved ones away from these toxic Neolithic foods? Everyone believes pro-grain propaganda, and they just turn off when I try to explain this stuff. ‘You need grains for energy!’ etc. All made worse by living in a rice-based food culture. ‘You don’t eat rice? Weird!’

    Aaaaaggghhhhh….

    Comment by Eric — 3 January 2006 @ 12:56 AM

  2. Hey Jason –

    Nice!

    A couple comments, first on endemic/epidemic diseases…

    Wouldn’t it be appropriate to say that endemic diseases are as much, or even more, a gift of civilized lifestyles? After all, endemic diseases are generally epidemic diseases that have adapted to us, and us to them, to a degree that we can abide each other over the long term (with flare-ups). Besides that, it seems that endemic diseases would be virtually impossible to maintain in a population of 30 (or even 150), even if all you looked at were the math…

    On the quote from “Introduction to the Paleo Diet”: I find it interesting that the skin of an apple is portrayed as a ‘barrier protecting the seed’ while the casing of a grain is considered to offer little protection… so we need to get right on that special tool to remove the apple skin, but we can crack open a grain of wheat with our fingers?

    I’m not saying that the general argument is not valid… but there is an unfortunate tendency for some of these paleo writers to twist things to suit their needs. (They could have pointed out the indigestibility of the germ, for example, but instead they create a false frame. For shame.)

    Janene

    Comment by Janene — 3 January 2006 @ 9:34 AM

  3. This was your best one yet, IMNSHO. Extremely well done.

    I wanted to add one (or, rather, two) points.

    It’s interesting to note how quickly the host of maladies that were introduced to the Americas burned through the populations. The first reaction is to wonder (1) Why the Europeans transmitted, but never received anything nasty, and (2) If smallpox and other nasty bits hit “virgin soil” in Europe and carried off a fifth to a third of the population a pop, howcumzit that 99% got lixivated by the same viruses in the new world?

    The answer to (1) is, as stated above, that most nasty bugs humans get are from animals - that’s why they burn through a population so rapidly. It’s bad business to kill off your hosts so rapidly that you can’t spread because they’ve all, well, died. But if you’re convinced your host is a pig or a cow, you don’t have much choice in the matter. Most of the “original” endemic diseases are actually human diseases; the rest come from domesticated animals. So when the natives didn’t have many (if any, in most cases) domesticated animals, there was nothing to “spread back” to the Europeans.

    Number 2 is a little more interesting and complicated. Of course, a big part of the answer is that the societies were being hit by an enormous complex of diseases all at once. The second reason for the massive mortality rates from these diseases that previously only had a 33% maximum kill-off rating in virgin soil was that most Native groups in the Americas belong to one of four haplogroups, a sort of genetic grouping of (among other things) immune system responses. Eurasia and Africa, on the other hand, had and still have dozens, if not outright hundreds, of haplogroups. This means that if a disease comes through an area, it’s only going to find a maximum of four different ways to fight it. The limited number of haplogroups directly corresponds to the few groups that made their way to the Americas and populated it sometime between 40,000 and 10,000 years ago.

    Once again, this article was fantastic.

    - Chuck

    Comment by Chuck — 3 January 2006 @ 11:30 AM

  4. Well, there is an interesting counter-point to the direction of the germ flow here with syphilis. I saw a NOVA episode that made a fairly convincing argument that Europeans did get syphilis from the Native Americans … but among them, it was pretty close to a head cold: originally airborne and fairly harmless. The puritanical European attitude toward sex, and all those stuffy, maladapted English clothes, prompted syphilis to mutate in order to find a better way to infect its hosts–and it became a virulent, insanity-inducing STD, with its modern effects of, well, rotting your brain out.

    So, the germ flow sometimes went the other way, but like Diamond pointed out, whoever has the domesticated livestock is going to have all the germs, too.

    Comment by Jason Godesky — 3 January 2006 @ 11:44 AM


  5. I have seen it mentioned by a Paleo diet researcher (Cordain) that schizophrenia may be related to grain consumption - apparently populations that don’t eat grains have very low levels of that condition. If that is so, it would mean we are wrong when we include schizophrenia as having some relationship with or relevance to shamanistic states - it’s just another of the many, many, many diseases of civilization.

    I haven’t read this report specifically, but there are a lot of ways in which non-agricultural societies differ from our own. So I would think it would be very difficult to pinpoint the cause specifically to grain consumption based solely on that line of reasoning.

    Comment by Mike Godesky — 3 January 2006 @ 12:41 PM

  6. I’ve read somewhere that rates of dental problems among hunter-gatherers were much lower than among agriculturalists. Actually, here is the quote:

    “Hunter-gatherers presented low rates of dental caries but greater occlusal attrition and higher frequencies of alveolar resorption and antemortem tooth loss. The agricultural samples showed an increase in dental caries. Kennedy also felt his samples suggested that in South Asia the transition to agriculture was associated with increased porotic hyperostosis, caries, abscesses, dental hypoplasia, lines of increased density and a ‘broad spectrum of diseases’” (Hunter, 1984).

    Comment by Lope — 3 January 2006 @ 3:04 PM

  7. Absolutely. You don’t see wild animals with cavities or gum disease, do you? The primo, #1 cause of all dental problems is the acidic reaction of sugar in the human mouth. Humans simply are not adapted to eating sugar. When you shift to a carbohydrate (sugar) based diet, your mouth is constantly full of corrosive acids that eat away at your teeth and gums.

    Hell, foragers don’t even need to brush their teeth, but as soon as humans picked up bread, it was all downhill. You see massive tooth decay and tooth loss in Egyptian mummies, for example.

    Comment by Jason Godesky — 3 January 2006 @ 3:38 PM

  8. I’m wondering what type of cow milk you’re referring to when you talk about all of the diseases associated with it. Obviously, the highly processed industrial waste that is advertised as milk is going to do what all industrial waste does.

    What about raw, unprocessed cow’s milk? I know people who are lactose intolerant who can drink raw cow milk and not have any of the reactions they get from the industrial waste. It’s also the same with raw goat milk, and any other dairy products that come from it.

    Comment by Jarcnek — 3 January 2006 @ 6:43 PM

  9. Hi Mike,

    I can’t defend the thesis, as I have not read the research either. I would hope, however, that your point had already occurred to whoever conducted it, being a fairly obvious objection, and fatal if not contested.

    That said, there is plenty of dud research floating about…

    Comment by Eric — 3 January 2006 @ 10:05 PM

  10. Re schizophrenia and grains again, to quote from the paper where I read this:

    ‘It has been more than 30 years since Dohan first formulated the hypothesis that opioid peptides found in the enzymatic digests of cereal grain gluten are a potentiating factor evoking schizophrenia in susceptible genotypes…In a meta-analysis of the more than 50 articles regarding the role of cereal grains in the etiology of schizophrenia published between 1966 and 1990, Lorenz concluded ‘In populations eating little or no wheat, rye and barley, the prevalence of schizophrenia is quite low and about the same regardless of the type of acculturating influence.’ In support of this conclusion are multiple studies which have shown that schizophrenic symptoms improved on gluten-free diets and worsened upon reintroduction. Furthermore, the incidence of schizophrenia is about 30 times higher in celiac patients than in the general population, and schizophrenics have elevated circulating IgA antibodies to gliadin.’

    [Cordain, L. ‘Cereal Grains: Humanity’s Double-Edged Sword’ p57, online at http://www.thepaleodiet.com/articles/Cereal%20article.pdf]

    Comment by Eric — 3 January 2006 @ 10:36 PM

  11. Regarding the milk question, a good website to check out is http://www.notmilk.com

    Here is what it says about raw milk:

    I’ll be sure to give Sally a copy of a study that appeared in the journal Dairy Science (1999 Dec, 82:12), in which raw milk samples from dairy herds were tested. Here is what the scientists found:

    “Bulk tank milk from 131 dairy herds in eastern South Dakota and western Minnesota were examined for coliforms and noncoliform bacteria. Coliforms were detected in 62.3% of bulk tank milk samples… noncoliform bacteria were observed in 76.3% of bulk tank milk.”

    Drink raw milk and you’re not the only one at risk. The Los Angeles County report reveals:

    “Although the initial impact of the disease is on the individual consumer, many pathogens may be transmitted from person to person, including to family members, and patrons of restaurants if the individual is a food handler. The fetus of a pregnant woman may be at risk. Some of the diseases associated with the pathogens can lead to death, particularly among vulnerable persons.”

    The Los Angeles County report cites Centers for Disease Control estimates that no more than one out of 20 cases of food borne illness are reported to local health departments. Such illnesses are epidemic in nature, and rarely reported by the media. Various examples of mass milk poisonings were given in the L.A. study.

    In 1985, an outbreak of listeria was linked to soft cheese made from raw milk produced in Los Angeles. Of the 142 cases reported, 93 were in pregnant women or their children. There were 48 deaths, including 20 fetuses.

    Since 1973, 394 cases of salmonella have been reported in Los Angeles County. Of these, 101 (25.6%) were consumers of raw milk. Molecular fingerprinting identified the strain of bacteria in ill persons as the same as that found in raw milk samples.

    Health Benefits of Raw Milk

    A rigorous review of the medical and scientific literature by the L.A. County investigators found no studies suggesting health benefits from consuming raw cow’s milk.

    And organic milk:

    Oh yes…organic milk. The healthiest milk from the healthiest cow is naturally loaded with lactoferrins, immunoglobulins, and growth hormones. Horizon’s organic milk contains animal fat and cholesterol, dioxins, and bacteria. The amount of somatic cells (pus) in organic milk is lower than in milk from non-organic cows, but it’s still dead white blood cells and dead bacteria. Ask yourself this question. Does organic human breast milk sound like a delicious drink for an adult human? Instinctively, most people know that there are substances in breast milk that are not intended for their adult bodies. Same goes for pig’s milk and dog’s milk. Same for cow’s milk.

    Some people may not be able to tolerate lactose, a milk sugar. One hundred percent of humans are allergic to casein, a milk protein. Eighty percent of the protein in Horizon’s organic dairy products is casein, the same glue used to adhere a label to a bottle of beer. Eat casein and your body produces histamines, then mucus. This sludge congests your organs. Give up all milk and dairy products for just one week and an internal “fog” will lift from your body.

    Is genetically engineered milk dangerous? You bet! Organic milk contains just a little bit less of the same hormones. I find very little difference between the two.

    Comment by Vicky — 3 January 2006 @ 11:46 PM

  12. What about raw, unprocessed cow’s milk? I know people who are lactose intolerant who can drink raw cow milk and not have any of the reactions they get from the industrial waste. It’s also the same with raw goat milk, and any other dairy products that come from it.

    Of all the mammals in the world, only a mutant form of human can digest any kind of milk past puberty. Drinking the milk of any other species is bizarre, and has some pretty awful health effects.

    Now, if you also pour all manner of toxins in it, yes, it is worse. If you pour cyanide into bat piss, that is worse. That does not mean that drinking straight-up bat piss is a good idea, either. It just means that the first idea was even worse.

    If you are old enough to read this, then all milk is probably bad for you. Organic milk is only normally toxic, whereas factory-farmed milk is mega-super, make your spleen bleed toxic. Organic milk only looks good by comparison.

    Why? Because you’re a person, not a cow. And milk is for baby mammals, anyway, which you probably aren’t anymore. It’s good for some animals–namely, baby cows. When you were a baby, human milk was good for you, but not anymore. But never was there a time when cow milk was good for you. Because you are not a cow. You are a human. Stop drinking the cow milk. End of story.

    Comment by Jason Godesky — 4 January 2006 @ 1:03 AM


  13. I can’t defend the thesis, as I have not read the research either. I would hope, however, that your point had already occurred to whoever conducted it, being a fairly obvious objection, and fatal if not contested.

    Well, just a quick search turned up this article. According to this, “Many neurological complications may be associated with immune reactivity to antigens found in cereal grains. It is suspected that autoimmune processes are involved. Even autism and schizophrenia show susceptibilities to grain glutens that aggravate (or even cause) the conditions. There are clinical studies indicating that there is a rapid remission of schizophrenic symptoms by introducing gluten-free diets.”

    That’s interesting. I always figured the lower levels of psychological disorders among non-civilized societies were due largely to lifestyle differences. I wonder how accurate this research is.

    Comment by Mike Godesky — 4 January 2006 @ 1:58 AM

  14. re: Milk…

    the point would be better made if the LA Times had looked at milk that really WAS organic, and not at a product from the Horizon mega-corp, which is not much different from anything else you’d get in any Giant Eagle grocery–

    “Mark Kastel of the Cornucopia Institute, a Wisconsin-based farm policy research group, created a wave of controversy and media attention in the organic dairy industry in February when he filed complaints with the USDA against Horizon Organic, Aurora Dairy and a farm owned by Case Vander Eyk, Jr., alleging that these so-called ‘organic’ dairy companies were in fact engaging in factory farming operations. The Institute’s complaints asked the USDA to investigate whether these three farms were violating the law by milking a large number of cows in a relatively small setting, without legitimate access to pasture, and still labeling the milk organic.

    Kevin O’Rell is Vice President of Research and Development for Horizon Organic, as well as Vice Chair of the NOSB. The Horizon Organic company is owned by Dean Foods, the nation’s largest dairy processor and distributor, and sells more organic milk than any company in the world. In his dual role as Horizon employee and NOSB member, O’Rell for the first time went on the record about this issue.”

    there are also questions about the feed, etc.

    http://wholelifetimes.com/2005/wlt2710/organicmilk2710.html

    still, very interesting essay. but what happens to someone like myself who has put in almost 40 years getting addicted to dairy and grains? where’s the Hunter-Gatherer Recipe book?

    Comment by Librarian — 4 January 2006 @ 3:48 PM

  15. Two things.

    First thing:

    Lorenz concluded ‘In populations eating little or no wheat, rye and barley, the prevalence of schizophrenia is quite low and about the same regardless of the type of acculturating influence.’

    About the schizophrenia. Are you all familiar with ergotism? A fungus that propagates itself by infecting rye grain is the cause of this unwanted trip.

    I could see how people who didn’t intend to have an acid-like trip might become schizophrenic, especially after repeated doses.

    Even modern rye, for all our fancy fungicides, cannot be fully protected from this fungus. And since the ergotamine alkaloids aren’t as pure as LSD-25 would be, you’ve got potentially toxic doses.

    St. Anthony’s Fire has been attributed to ergotamine poisoning; there are historical times when entire villages were thought to be mad.

    But hey, don’t take my word for it…

    The other thing, about milk and raw milk:

    Raw milk is said to contain lactase, but I googled it and apparently that turns out to be false.

    However, there is something really cool about the mutations that allow humans to continually drink milk:

    http://en.wikipedia.org/wiki/Lactose_intolerance

    The part I thought was neat was that Western cats developed the same mutation along with Western people, while Eastern house cats share their masters’ lactose intolerance…

    Comment by TonyZ — 4 January 2006 @ 4:02 PM

  16. If it’s got lactose in it, it’s not good for you. Granted, Horizon’s little better than ConAgra, but lactose is lactose and normal adult mammals can’t digest it.

    still, very interesting essay. but what happens to someone like myself who has put in almost 40 years getting addicted to dairy and grains? where’s the Hunter-Gatherer Recipe book?

    This is a good start. I went paleo on Jan. 1. I’ll report in once I’ve stabilized.

    Comment by Jason Godesky — 4 January 2006 @ 4:04 PM

  17. No grains at all would be ideal, but by soaking and/or fermenting grains you can eliminate many of the bad health effects. Here’s a good article about it (except the bit about cheese).

    Comment by Ran — 4 January 2006 @ 4:39 PM

  18. Jason, above you said if you’re human, you shouldn’t be drinking milk, end of story.

    Well, that’s pretty insensitive to the 6000-year history of lactase-producing human adults.

    you’ve got to let the throttle ride every once in a while, man. If you keep punching it you might end up with leaks in your engine block.

    If you want to be smug like the scientists on CSI, I encourage you. Breathtaking intelligence is inspiring. Heavy-handed I-said-so’s only blemish your record.

    Of course, I can agree with this statement:

    Approximately 70% of the global population cannot tolerate lactose in adulthood. Thus, some argue that the terminology should be reversed, lactose intolerance should be seen as the norm, and the minority Western European group should be labeled as having lactase persistence.

    Comment by TonyZ — 4 January 2006 @ 6:00 PM

  19. Just because you’re one of the small, mutant population that is capable of drinking milk doesn’t make it a good idea. All that development gave such people was the ability to drink it, not the ability to do anything useful with it. If I got a mutation that allowed me to drink mud, it wouldn’t make drinking mud a good idea–it would just make it a non-fatal idea. There’s lots of non-fatal ideas that aren’t good ideas.

    Comment by Jason Godesky — 4 January 2006 @ 6:33 PM

  20. There is nothing wrong with exploiting a mutation. I think that our milk mutation probably originated for a good, evolutionary reason. People were living in areas where there was nothing to forage for. They ate anything they could. The survivors developed the milk mutation. People who aren’t dead yet haven’t eaten anything to kill them.

    Now on to milk…
    Our current milk-drinking culture is a by-product of pasteurization and refrigeration. People didn’t historically drink raw milk because it was always warm and sour. Who wants to drink that? In certain cold climates with caves, some people developed techniques to control the growth of bacteria in milk and created yogurt and cheese.

    Raw yogurt and raw cheese are very popular in Europe. Raw milk is not. I notice that no one ever cites statistics involving deaths of Europeans due to raw milk.

    Pasteurization kills the milk. That is why it has expiration dates and must be chilled. It is dead and decomposing. Raw milk is still alive. It is no more or less likely to be contaminated by salmonella or listeria than anything else from a farm. I think that here in the US our standards are just so lax that contamination is the norm - that is why we pasteurize.

    Comment by JohnD — 4 January 2006 @ 10:02 PM

  21. Mutations arise for completely random reasons; they proliferate for good, evolutionary reasons. In Sweden and the Middle East, 6kya, this mutation provided agricultural populations with an invaluable food source to keep them alive, and so it proliferated. It became so prevalent in India that the use of cows for dairy led to the development of beliefs about the sacred cow.

    As I’ve often said, given a few million years, humans would become adapted to civilization. The proliferation of the lactose tolerance mutation is one example of that.

    However, my point here is that our adaptation to dairy as a food source is still incomplete. We’re adapted enough to be able to use it, but not well. Lactose intolerance is the norm, but even those who don’t suffer from lactose intolerance suffer a whole host of problems. Apart from decomposition and issues of contamination by pesticides and other pollutants is the basic, inescapable fact that cow milk is adapted to cow physiology–not human physiology. Perhaps in a few tens of thousands of years we could adapt to make use of cow milk the same way we do meat, fruit, or nuts, but that will not be for tens of thousands of years.

    Or, we could just eat the enormous, delicious buffet that nature has bequeathed to us as enormously adaptive omnivores.

    We can eat almost anything on the planet … so naturally, we found the few things we can’t eat, and built a civilization on them.

    Comment by Jason Godesky — 5 January 2006 @ 12:02 AM

  22. Any ideas about how to eat paleolithic without all the crap that goes into meat production? I’ve been vegetarian for a long time now, so I haven’t been checking ways to find healthy meat. Would kosher sources be best?

    Comment by scruff — 5 January 2006 @ 11:23 AM

  23. Hey –

    Organic all the way. It’s gotten easy enough to get — especially if you have Whole Foods in your area. I can get ground beef from them for only a dollar or two (per pound) premium.

    Janene

    Comment by Janene — 5 January 2006 @ 12:21 PM

  24. Organic means nothing, as the above studies attest.

    Raise, hunt, forage, or grow your OWN food.

    do you get it, your OWN food? That’s the point, no middle men. From the food to your belly, with a kiss of flame, perhaps…

    Or try not to get depressed about your taker lot. Cause if it’s in a grocery store, it’s still tainted with Taker Karma. No matter how well the product is greenwashed.

    Whatever is under the yellow sun is for you to eat. Whatever is under the blue hum of incandescent lighting is what’s holding you back.

    Comment by TonyZ — 5 January 2006 @ 5:57 PM

  25. Tony - First of all, the USDA organic label itself means nothing. But other labels (that the USDA is desperately trying to destroy) DO mean something. And there are a number of farms that hold themselves to high environmental standards, out of a genuine desire to help.

    It would be nice if we were all capable of hunting and gathering right now. Unfortunately, most of us live in cities and are restricted by jobs (not enough time to learn), money (not enough money for the books/classes), and location (too far away from wild areas to make foraging a part of our day-to-day lives). Or maybe we just shouldn’t eat anything until we’ve learned to hunt and gather…?

    Comment by Giulianna Lamanna — 5 January 2006 @ 6:08 PM

  26. Yes, Tony, but scruff asks a very good question. It’s too much to just ask someone to run off into the woods tomorrow. What are the little steps we can take, that add up to a big change?

    Scruff, eating paleo isn’t something they make easy. I’ve been doing it since Jan. 1, but it can be done. Factory-farmed meat is awful, but at least it’s still meat. Under all the pesticides and what not, there’s still food in there–somewhere. My own feeling is that factory-farmed meat is probably better than organic milk, so that’s where I make my compromise with the realities of my checking account.

    I don’t think kosher would help anything whatsoever. Kosher and halal aren’t about meeting standards that are related in any way to health, environmentalism, or even human evolution–they meet arbitrary cultural taboos that are largely without rhyme or reason. So, eat kosher if you’re Jewish and you want to connect to that. Beyond that, I fail to see any benefit that kosher food could possibly have over anything else.

    As Janene said, there’s always organic meat. Next best thing to something you killed yourself. Though, with the amount of meat you eat on paleo, organic meat may be cost-prohibitive.

    But ultimately, Tony’s right. The ultimate goal is to hunt and gather your food. But along the way, probably the best thing you can do for yourself is to get your body off the maladapted civilized diet and at least feed it the general foods to which it is accustomed. If you really want to do it well, organics are better, but they cost more. If you can’t afford organics, my own view is that it’s better to eat factory-farmed food than organic non-foods, like milk or cereal grains.

    Then, go camping more often.

    Maybe go fishing a little more often.

    Ever want to learn to hunt? :)

    Comment by Jason Godesky — 5 January 2006 @ 6:10 PM

  27. Oh, I understand what the ultimate goal is. It’s just that if I tried to do that right now, I’d die of my own incompetence. Maybe next week :)

    Comment by scruff — 5 January 2006 @ 8:39 PM

  28. Whilst we’re destroying those nasty Taker milk memeplexes, let’s be sure to properly endorse the One Right Way to eat.

    Comment by Chuck — 6 January 2006 @ 12:52 AM

  29. Who said anything of the kind? There’s an unbelievable diversity in forager diets–any kind of meat, red meat, white meat, dark meat, waterfowl, fish … any kind of fruit or berry … leaves, roots, flowers, nuts …

    It’s not ethically wrong, but if you eat poison, or something you can’t really digest, then it’s not going to be healthy. That’s not promoting a “One Right Way,” that’s understanding basic biology. As I said above: “We can eat almost anything on the planet … so naturally, we found the few things we can’t eat, and built a civilization on them.”

    Comment by Jason Godesky — 6 January 2006 @ 1:13 AM

  30. I forgot eggs, insects, crustaceans, snails, mushrooms … there’s way, way too much to list. We’re omnivores, there’s damn little we can’t eat. Pretty much just cereal grains and the milk of other mammals (which, really, is always off-limits, I mean, it’s a different species, fer Chrissakes!)

    Comment by Jason Godesky — 6 January 2006 @ 1:17 AM

  31. Whilst we’re destroying those nasty Taker milk memeplexes, let’s be sure to properly endorse the One Right Way to eat.

    Jason is merely pointing out a Wrong Way to Eat.

    You can probably eat unhealthy amounts of any type of food in any society in any place in the world. But civilization is remarkable because this unhealthy diet is intrinsic and necessary for its sustenance.

    Comment by Devin — 6 January 2006 @ 6:33 AM

  32. the basic, inescapable fact that cow milk is adapted to cow physiology–not human physiology.

    I just wanted to say that the above argument is pretty worthless. Especially when you go on to say that eggs and nuts, etc., are OK to eat.

    Eggs are adapted to embryonic bird physiology. Nuts are adapted to embryonic tree physiology.

    The question isn’t what the food is adapted to, the question is what Humans are adapted to eat.

    JimFive

    Comment by JimFive — 6 January 2006 @ 10:46 AM

  33. (2) If smallpox and other nasty bits hit “virgin soil” in Europe and carried off a fifth to a third of the population a pop, howcumzit that 99% got lixivated by the same viruses in the new world?

    I don’t see that anyone addressed this. From what I recall the answer is that exposure to cowpox conferred some immunity to smallpox.

    Comment by JimFive — 6 January 2006 @ 10:56 AM


  34. I just wanted to say that the above argument is pretty worthless. Especially when you go on to say that eggs and nuts, etc., are OK to eat.

    Eggs are adapted to embryonic bird physiology. Nuts are adapted to embryonic tree physiology.

    The question isn’t what the food is adapted to, the question is what Humans are adapted to eat.

    Well, except that out of milk, eggs, and nuts, only one is produced for the specific purpose of being consumed. Cows, being cows, have evolved to produce milk based on what’s good for a bovine physiology. They unfortunately neglected to take our needs into consideration. To get the same benefit from cow milk that is enjoyed by, you know, cows, the human body would have to adapt to digesting it just like it did eggs and nuts. And while that may not be entirely outside the realm of possibility, it would probably take a little more than a few thousand years.

    Comment by Mike Godesky — 6 January 2006 @ 1:48 PM

  35. It seems like at least I’ve learned that 30 percent of humans are lactose (lactase) persistent, rather than 70 percent of the population being lactose intolerant.

    I’m a little annoyed because I was lied to about raw milk having lactase in it. But it illustrates to me that green liberals have not completely escaped using non-facts as persuasive arguments.

    Well, I have to say, I’m slightly against baby steps. I’m a fan of walking upright.

    You see, when America sent food to a place for the first time, and kept sending it, it’s almost as if the previous way of life was gone forever. So when America quit sending that food, they starved even more. Many peoples just forgot how to get food sustainably because they were turned on by the food provided for them. Gee, humans forgoing work for no work… never heard of that…. We don’t rationalize how we are manipulating mythic cultural belief holarchies when we feed the hungry.

    So if a way of eating, thus a way of life, is gone within minutes after the first “provided” meal, then the only thing standing between you and real food is effort, not time.

    Baby-step ideology stems from the belief that things take a certain prescribed amount of time to happen.

    But food can change by the meal.

    you could go to WalMart, buy a rifle, and get you some food, today.

    Where’s the baby steps in that?

    I’m not ignoring the background information it takes to get there, but I will henceforth treat any adult learning of tracking and other skills as remedial, not forward-moving. This is because I was raised with these skills as a child and am now a forwardly mobile human because of this. We all have skill sets that make us adult (or comfort us in the idea of being adult). But a particular skill set that would set right our food problem is taught to every eager Boy Scout who is willing to learn it.

    As to where I want to be, moving forward is a matter of action, not an educational endeavor. So is it understood that if you didn’t learn this stuff as a child, it is your duty to learn it for your children?

    So if you are not standing in a place of action, then you need to enroll yourself in remedial education. Not all forgot that which was left behind in the Great Forgetting. You shall see in deep reflection that pyramid food production has not succeeded in becoming the one right way.

    I am not an active hunter or an extremely avid fisher (I do have a license in two states), and that is not because of lack of desire; it is because I do not have a home. What sense is there in hunting deer with no freezer to store it, or people to share it with?

    Being transient has taught me much about the world, but it is not the way of life that I will eventually choose.

    You must own your land
    then you can own your food
    then you can own yourself

    There, those are the adult steps in the development of adult Tribal consciousness. Everything before is prep work (those remedial baby steps for those who need it); everything after the three steps is the reclaimed creative life that praying to the pyramid has stolen from us.

    Comment by TonyZ — 6 January 2006 @ 2:45 PM

  36. Just a quick question about the forbidden foods on the paleo diet: If they’re so bad for us, why do they taste so good? All of our typical “comfort foods” contain neolithic ingredients: macaroni and cheese, red beans and rice, mashed potatoes, ice cream, chocolate, etc. Is it purely cultural, or are there other factors? I’ve read about opioids in dairy and wheat gluten; is there something similar in other grains, potatoes, and beans?

    On the other hand, as you pointed out in the article, dark, leafy greens are one of the most nutritious foods we can eat. Yet most of them are bitter and yucky. I like spinach and broccoli, but turnip greens, dandelion, and Brussels sprouts are too strong for me. It just seems odd that the foods that taste the best are the ones that are the worst for us, and the foods that taste the worst are the most nutritious.

    Comment by Vicky — 6 January 2006 @ 3:03 PM

  37. Dear Vicky,

    Taste is relative. Some here would attest that I could take food you absolutely hate and turn it into the best meal you ever had.

    Comment by TonyZ — 6 January 2006 @ 3:09 PM

    I would agree with that, Tony, absolutely.
    Comfort foods are generally foods that make you feel good when you eat them, yes?
    Somehow or another, I’ve turned my snack of sliced up (and sometimes baked) apples with cinnamon and raisins into a mighty fine comfort food.
    Healthy and perfectly paleo acceptable.
    Spices and whatnot are acceptable on the paleo diet as well, so those leafy vegetables, while not tasty to everyone, can quickly become rather tasty with a little bit of tweaking.
    But like Tony said, taste is relative.
    We grew up on grains and those sort of comfort foods.
    It’s about making an adjustment.
    And once you knock a lot of sugars and salts out of your diet, taste seems to change.

    Comment by Miranda — 6 January 2006 @ 3:37 PM

  39. The question isn’t what the food is adapted to, the question is what Humans are adapted to eat.

    This is, without a doubt, the single best line I’ve ever read about human food consumption. The only thing that I don’t like about it is that I didn’t think of it first.

    It’s not a matter of what’s designed for humans; it’s a matter of, “What can humans successfully and healthily eat?”

    Well, just about f***ing everything. Including grains. And milk. And brie. And brie and avocado and baguette sandwiches, which I have lovingly named, “Fat Sandwiches.” Try it. They rock.

    But, for the short hand, “Just about f***ing everything.”

    - Chuck

    Comment by Chuck — 6 January 2006 @ 4:14 PM

  40. No, that’s exactly the point, Chuck. We’re not adapted to eating grains, or drinking milk.

    For example, in the article I talked about the lectins in grain. Now, birds are adapted to eating grains, and they have enzymes in their bodies that react with the lectins and neutralize them. Humans are not adapted to eating grains, so we have no such enzymes; instead, lectins produce all the deleterious effects I enumerated in the article, namely,

    Lectins–found in cereals, potatoes, and beans–have effects throughout the body as widespread and significant as our own hormones, but originating from outside our bodies, they react with our physiology in ways that are often quite harmful. They can strip off protective mucous tissues, damage the small intestine, form blood clots, make cells react as if stimulated by a random hormone, stimulate cells to secrete random hormones, make cells divide at improper times, cause lymphatic tissues to grow or shrink, enlarge the pancreas, or even induce apoptosis.

    They’re also why civilized human populations are the only animal populations with a significant incidence of cancer: we consume lectins so regularly. Note that for all the pollution we’ve put into the world, we still remain the only animal species with a significant incidence of cancer. Dogs and birds and other mammals are subjected to those pollutants as much as we are, but only humans on a grain-based diet full of hormone-like substances to which we have no adaptation wind up with significant cancer rates. Is it any surprise that a mammalian population, eating a food it has no adaptation for, one containing chemicals with wide-ranging and essentially random effects on every part of the body, could end up with a disease that results from random damage to its cells?

    Milk’s in the same category. We haven’t been drinking the milk of other mammals nearly long enough to have the adaptations to do so healthily.

    Obviously, it’s possible to eat these things and not die immediately. My dad also drank from mud puddles (and that may have been better for him). And in tens of thousands of years, we may even become adapted to these foods, so that we can actually derive nutrition from them. But that is not the case today.

    Yes, we can eat damn near anything. We’re omnivores! But it’s not everything–it’s damn near everything. The list of things we can’t eat is actually pretty short: cereal grains, legumes, the milk of other mammals … that’s really pretty much it, innit? The problem is we’ve built a whole civilization on those, and we’ve put them in everything. In the wild, you’d have a hard time finding any of them in sufficient quantities to make a meal of, so where’s the surprise that they only pop up recently, with agriculture?

    Why do they taste so good? Because we react positively to sugar: our bodies need some sugar to operate, and that desire drove us to, say, brave a beehive for the honey. The same adaptation that kept us alive in one context damns us in another–when you’ve got a bunch of pasta in front of you, for instance, a meal that simply wasn’t possible in a forager context, and our bodies have no idea what to do with it.

    Study the way your digestive system works. The way it handles vegetables is incredible. Meat, amazing. Grains, though, it really has no idea what to do with. Corn passes through almost completely untouched. We know sugar, so we extract that, but that’s about all we get out of it.

    We have no adaptation to it. When you eat grains or legumes, or drink milk, you’re consuming something that your body doesn’t recognize, doesn’t know how to really digest, and doesn’t know how to properly defend itself against. Nothing is made specifically to be our food, so the question is what have we adapted to make our food. It’s a short list of things that we haven’t, really, but the big stars of that list are grains, legumes, and the milk of other species.

    Comment by Jason Godesky — 6 January 2006 @ 4:32 PM

  41. African Bantu show only 10 percent lactase persistence, while African Tutsi show nearly 80 percent lactase persistence (Source: Wikipedia).

    Now, there could be, I believe, psycho-sexual arguments against the ingestion of milk. But genetic, maladaptive arguments are debunked by science.

    Globalization makes it harder because in this village, we are trying to find something that EVERYONE can eat, when biologically, we haven’t caught up to the concept of a one-world diet. Our eyes haven’t caught up to the myriad of things we now see. We use details and observations to fit our own ends.

    The only possible simplification of knowledge is that nothing is simple. We must challenge our desires to know.

    Comment by TonyZ — 6 January 2006 @ 5:23 PM

  42. Tony, Tony, Tony … you should stick to the profound stuff. There’s no way I can dance around this or mitigate it. You’re wrong. Flat-out, no question, utterly, completely wrong. ALL of the scientific evidence indicates that we are genetically and physically maladapted to drinking any milk into adulthood, and the milk of other species at any age.

    I’ve linked more general pages on the scientific evidence that humans are completely maladapted to consuming the milk of any other species.

    Being able to drink down this stuff without immediately up-chucking is only the most superficial kind of adaptation–and most of us don’t even have that. For those “lucky” few who can digest lactose at all, cow’s milk will still give you all the problems above. That’s not exactly adaptation.

    Comment by Jason Godesky — 6 January 2006 @ 5:44 PM

  43. The list of things we can’t eat is actually pretty short: cereal grains, legumes, the milk of other mammals … that’s really pretty much it, innit?

    And yet, you want to make all mushrooms off-limits for your tribe. And for a very good reason, too.

    Comment by _Gi — 6 January 2006 @ 6:04 PM

  44. Hey –

    There is one thing I keep noticing about your discussions on food, Jason… they tend to be somewhat selective.

    You find it unreasonable to ferment grains or beans to make them edible, but you have no problem at all with soaking the tannins out of acorns to make them edible.

    Perhaps a small blind spot?

    Janene

    Comment by Janene — 6 January 2006 @ 6:07 PM

  45. No–acorns are perfectly edible as-is. I’ve eaten them.

    Acorns, uncooked, are perfectly fine, but taste bad.

    Grains, uncooked, taste bad, and will kill you. Big difference there.

    Gi–I, personally, am not a fan of mushrooms. But I don’t make laws for anyone, not even the Tribe of Anthropik. I, personally, don’t like, or trust, mushrooms, so I won’t eat them. I don’t care if someone else does, though. I’ll teach my kids not to trust mushrooms, too, but I won’t ban them. That’s not my place.

    Hell, it’s not even my place to ban wheat or milk. You can eat them all you like; they’re just not food. I’m just warning you: if you do, expect to be disease-ridden and sickly. If you choose to eat these things we can’t really digest, don’t come complaining when you wind up obese, malnourished, and afflicted with a host of diseases, in a general state of poor health.

    Comment by Jason Godesky — 6 January 2006 @ 6:18 PM

  46. The problem with mushrooms is not that people aren’t adapted to eating them. It’s that many species of mushroom are highly toxic. And distinguishing between mushrooms that are okay and mushrooms that will kill you is so difficult that most of the time you’re better off not even bothering with them.

    Comment by Mike Godesky — 6 January 2006 @ 6:44 PM

  47. Excellent article. I was particularly struck by the fact, which I’d never thought about before, that hunter/gatherers didn’t have to brush their teeth.

    This leads me to ask (and maybe this has been covered elsewhere on this site): What would you consider the perfect diet?

    Comment by Martian — 6 January 2006 @ 7:09 PM

  48. The one adapted to your bio-region and descent. :) Seriously, though, there’s no one good answer to that. A healthy mix is usually good; you’ll notice that foragers who have more available to them tend to get more of a mix of vegetable and animal sources. All kinds of edible plants like you would not believe, and an incredible diversity of meats … to say nothing of nuts, berries, etc. It’s all far, FAR more diverse than the diet we eat, which is 90%+ wheat, corn, and rice. (If you don’t believe it, try to make one full meal without using any corn syrup.)

    But there’s a huge diversity of very healthy forager diets. There’s the seafood diet, and the big game diet. The !Kung are crazy about their mongongo nuts, but the only vegetables Inuit ever see come out of a caribou’s stomach.

    Really, almost anything you can imagine qualifies as a healthy diet … unless you’re civilized, where everything we’ve managed to domesticate is, well, awful for us.

    So there’s no one perfect diet. There’s no one right way to eat. :) There are poisonous things, and things that are terrible for us, and in civilization, we call those things “staple foods.” :) But ultimately, I’m just advocating we take another look at the vast diversity we left behind, and stop this single-minded fixation on the few things that we really can’t eat.

    Comment by Jason Godesky — 6 January 2006 @ 7:21 PM

  49. Hey –

    Are you sure acorns are okay to eat raw? Not a few, but as a staple?

    Anyway,

    It’s all far, FAR more diverse than the diet we eat, which is 90%+ wheat, corn, and rice. (If you don’t believe it, try to make one full meal without using any corn syrup.)

    It’s not that hard… unless you are trying to live off of nothing but prepackaged crap! (That was your point?)

    Janene

    Comment by Janene — 6 January 2006 @ 8:44 PM

  50. The effects of a Neolithic diet on health seem fairly well-documented, but I’d be interested to know what effect it has on mindset and behavior. Does anyone know of any research on the subject?

    My theory is that before civilization, Neolithic foods would occasionally be eaten as starvation foods. The consumption of these foods and/or the lack of real food triggered changes in behavior that helped populations survive famine periods - higher brain functions suppressed, people cluster together and become more sedentary, males defer to females, people become novelty-averse and indulge in previously low value repetitive actions, they start to want to ration everything, breeding increases etc. - i.e. they start behaving like people in civilization. This would only be a temporary situation as it is impossible to survive for long on raw Neolithic foods. But then someone had the idea of cooking food, allowing populations to survive on these starvation foods indefinitely. The bodily processes of the people eating these foods continued to think they were starving, which they indeed were at a cellular level, so the people continued on in a behavioral starvation mode, thus producing civilization.

    I’d also be interested to know about the different dietary requirements of men and women. I understand that men do best on a diet consisting almost entirely of animal meat and fat while women, who require a layer of fat to safely breed, would also want to consume some carb-rich foods such as starchy vegetables and fruit. This would explain why those food-groups tend to be culturally regarded as “masculine” and “feminine” respectively, and would suggest that a Neolithic diet is even more harmful to men than to women.

    Comment by Cornfed — 6 January 2006 @ 8:59 PM

  51. What do you (Jason) think about vitamins and minerals?

    Comment by sevenmmm — 6 January 2006 @ 10:16 PM

  52. I have been following a Paleolithic diet for 3 months now and have found it easy to maintain. I see no reason why I would return to eating Neolithic foods. My health has improved dramatically over this period, and I do not feel that I am depriving myself in any way. I am convinced that this style of eating should be considered by all civ skeptics, as it strikes directly at the base of the dominant culture and undermines the practice of totalitarian agriculture. Forming a dietary tradition can also be an important step in the development of a new culture, and being able to offer improved health could be a powerful attractor to the mainstream.

    Comment by Mark — 6 January 2006 @ 11:06 PM

  53. Vitamins and minerals fall under the umbrella term “micronutrients,” and it’s a descriptive one: you need them–badly–but you don’t need a lot of them.

    Wild edibles tend to be incredibly rich in vitamins and minerals, far beyond even our “vitamin-rich” foods. One cup of dandelion leaves has more vitamin C than orange juice, for example.

    A diet that’s mostly meat with some wild edibles for sides would not only give you all the protein and energy (from fat, not carbohydrates), but all the vitamins and minerals, too.

    After all, our nutritional needs do come from the forager context of our evolution–and you don’t find many vitamin supplements growing in the savanna, do you?

    Comment by Jason Godesky — 6 January 2006 @ 11:35 PM

  54. Gotta learn to pick your battles, man. You’re going to turn off a lot of people by sticking to the party line. That you feel you must be right is poor justification for telling others that they are unequivocally wrong, especially among a crowd where many do not believe in objective truth.

    - Chuck

    Comment by Chuck — 7 January 2006 @ 1:15 AM

  55. “Party line”? Wouldn’t the “party line” be “Got Milk?”

    My justification for telling Tony he was unequivocally wrong wasn’t my feeling of being right; it was the fact that he was unequivocally wrong. He said, quote, “genetic, maladaptive arguments are debunked by science.” That’s completely incorrect. “Genetic, maladaptive arguments” are made by science.

    If you want to believe that milk is good for you, that’s fine. If you want to believe that humans are genetically adapted to drink cow’s milk, go right ahead. Drink it up till it kills you (which it will). But you can’t claim that science is on your side there, because it is precisely the science that says we’re not adapted for it. It’s the United Dairy Farmers who say we are–the same folks who paid off the FDA to have dairy elevated to a basic food group.

    Some statements are matters of opinion. Some statements are on controversial issues, where intelligent men can disagree. Some statements fly in the face of all established fact, but we can still accept them as your beliefs, regardless of how, well, idiotic we may think them to be. But you can’t say “Jimmy said ‘yes!’” when, in fact, Jimmy said no. That’s what Tony did above: he claimed that scientific evidence debunked a particular position, when in fact it is precisely that position that scientific evidence supports. So his statement was unequivocally wrong–just as if someone were to go about saying that I support civilization, drinking milk, and intensifying complexity.

    Comment by Jason Godesky — 7 January 2006 @ 1:25 AM

  56. I’ve been Paleo for over a year now. Let me tell you: walking near the bread aisle is enough to make me want to vomit. It doesn’t smell like food to me; it smells like rotting corpses. Sickeningly sweet. Which makes sense–it’s all sugars. No nutritional value at all. There is no amount of bread you can eat (whole grain or otherwise) that can equal the nutritional value of a single dandelion petal. We are adapted to find sugars, because sugars are quick and easy energy. Very efficient. The problems arise when all you eat is quick and easy energy, with no vitamins or minerals to speak of.

    Jason, perhaps we’re being too negative with the milk thing. Saying “don’t drink cow’s milk, it’s bad for you” is a negative frame. It puts people off. Perhaps instead we should advocate switching to a milk that is designed for a species more similar to us. Perhaps advocating the consumption of chimpanzee milk would be more popular?

    Comment by Benjamin Shender — 7 January 2006 @ 2:45 AM

  57. And my first reaction to chimpanzee milk?
    Wow, that’s pretty disgusting, but then, I’ve been leaning more towards thinking cow’s breast milk designed for their offspring is pretty gross for us to drink too.

    Comment by Miranda — 7 January 2006 @ 12:33 PM

  58. Hey –

    Jason, since you didn’t answer my last question I decided to look for myself…

    ‘Tannins’ are a general class of chemicals, ranging from the gallotannin found in acorns — which is quite unhealthy, and should be saved for leather tanning (thus the common name) — to flavonoids and other antioxidant chemicals in most ‘bright coloured’ fruits and vegetables: blueberries and other berries, kale and other brassicas, grapes, teas, and so forth.

    So yeah, leach and eat all of the acorns you want, but raw… not so much :-)

    Janene

    Comment by Janene — 7 January 2006 @ 12:52 PM

  59. So, how am I still alive? I’ve eaten plenty of them raw. My guess would be that, since foragers have been eating leached acorns since time immemorial, we’ve developed adaptations to eat acorns–the kind of adaptations we don’t have for wheat. The need for processing is usually a good sign that you shouldn’t be eating it, because the basic rule is to stick to the foods that your body knows how to digest. It’s our adaptations that I’ve been hammering on all this time, after all. We digest acorns basically the same as any other nut, and we get a lot out of them, as opposed to wheat, which we strip of some sugar and basically pass all but untouched.

    I never would have thought it would be such a radical, controversial suggestion to say that maybe eating foods that we have no evolutionary adaptation to digest might have negative health consequences. It’s taken for granted in medical and dietary journals; you’d think it would be pretty much a no-brainer, but here we are, 59 comments in, with people telling me how I’m being a food nazi for saying that eating poisonous things we have no adaptation for might be deleterious to your health.

    Is the grain/milk brainwashing really that strong? We can accept that civilization is a great force for destruction, but not that the foods it pushes on us are unhealthy for us? Are we really that addicted to a glass of milk and a slice of toast in the morning, that the idea of bacon and orange juice is so repellent?

    Comment by Jason Godesky — 7 January 2006 @ 1:19 PM

  60. Hey –

    Hold up, bud… I’m not saying that you cannot eat acorns raw… I’m saying that like so many things, eating them raw, in bulk, is not terribly good for you.

    Just a couple months ago, Cory had an unfortunate experience with deadly nightshade, remember? By the name, and chemical composition, you might wonder why he is still alive. The answer is, of course, that in its natural form it isn’t strong enough to be more than discomforting.

    Raw acorns won’t kill you. But they will gradually, over time, eaten in quantity, pickle your intestines. I thought you might find that information useful and perhaps worthy of further research, especially in light of your future plans. But you’re acting like any question of diet that does not fit the notions you have developed is founded on functional blindness. So STOP IT :-)

    As it happens, you have convinced me to get back into research mode, and keep looking at all of this with an open mind. Next week, I get back on my pseudo-paleo, at which point I will be experimenting with variations on complete elimination of dairy, modified dairy, complete elimination of beans and grain, modification of beans and grain… combined with continuing research, we’ll see where I end up. Being one of those who CAN eat dairy without any obvious negative side effects (i.e., I have NO allergies, no chronic illnesses, and an insanely effective immune system — even though I have smoked for twenty years), I want to experiment and find out for myself how MY body reacts…

    Is that OK? ;-)

    Janene

    Comment by Janene — 7 January 2006 @ 2:01 PM

  61. Hey, I don’t make rules for anyone but myself. You can imbibe anything you like. I can predict that in your experimentation, no dairy will come out best. Even though you’re lactose tolerant, you’re still drinking cow’s milk–and that puts you at a higher probability for cancer, diabetes, autism, schizophrenia, multiple sclerosis … well, the list is already up there, but the point remains: it’s cow’s milk. And you’re not a baby cow.

    As far as the acorns, that is a very useful tip, and I appreciate it. But you introduced it as an equivalent to grain, as if to suggest that humans are also perfectly adapted to eating grain. Yes, most toxins are non-lethal in sufficiently small quantities, but if I had eaten raw even half the volume of grain that I ate in raw acorns last fall, I would not be alive right now. We digest acorns far more completely than grain. There is no equivalence between the two. We are adapted to eating acorns (though not as adapted as squirrels–nor will we likely ever be), but we are not in the least bit adapted to eating grains.

    Different people will always react differently, but some things are so far beyond the normal distribution curve that personal variation ceases to matter–it’s some kind of awful no matter who you are. Grains, rice, corn and milk are all in that category. You’re free to experiment any which way you like, but that doesn’t make it any more of a good idea, and it doesn’t change the fact that the civilization that nearly forces us to eat these things makes us sick by doing so.

    Comment by Jason Godesky — 7 January 2006 @ 2:42 PM

  62. But you can’t claim that science is on your side there, because it is precisely the science that says we’re not adapted for it.

    You rely heavily on science… but how do you decide what facts and figures to pick out of all the facts and figures that exist out there?

    What data supports your views out of all the conflicting data out there? The correct data? The true data?

    I see correct and true data all over the place.

    - Chuck

    Comment by Chuck — 7 January 2006 @ 4:04 PM

  63. If the data appears to be conflicting, it’s because you haven’t examined it closely enough yet. Interpretations abound, but the facts they rely on are always the same. The question is which metric is the most relevant, and you’ll notice I always attach an argument for why metric X is the most relevant one to look at.

    Comment by Jason Godesky — 7 January 2006 @ 4:18 PM

  64. Hey –

    I don’t think that is completely fair, Jason. I pointed out that you are perfectly willing to eat some things that require processing, but not others. That’s the only way in which I ‘equated’ acorns and grains. We could have just as easily been talking about poke weed.

    But what I think you sometimes overlook when we talk about grains is that some hunter-gatherers did eat grains, sometime in our past. Otherwise, we never would have started farming the stuff. It’s not like we started farming with tomato plants and THEN discovered grains. Wheat-Rye-Millet were IT.

    I read through ALL of your links above and I saw a couple of specific things that I want to research more, I saw a lot of possible correlations (as opposed to positive causation), and I saw a lot of selective reporting: for example, they point to casein as BAD — but it also exists in human milk; to IGF-1 growth hormones as BAD — but they also exist in meats; to excess calcium as something that can LEAD to osteoporosis — but H-G diets are generally higher in calcium than modern diets. WTF?

    I don’t know yet which factors are valid, and which are not, but the more I learn, the more I see that ALL of these food studies are biased. Perhaps mine and yours included!

    BTW, you said something earlier about being surprised at how strongly people have reacted to this: but you’re the one always saying that ‘food’ is cultural… so should you really be surprised?

    Anyway, I’m not trying to get ‘down’ on your thesis, but I do have some concerns about some of the research. When I sufficiently understand the chemistry to analyze the raw data myself, only then will I be satisfied. And I know you’re not trying to dictate food choices for everyone, but you are coming off as ‘uniquely right’ in this case, if you know what I mean.

    Janene

    Comment by Janene — 7 January 2006 @ 5:13 PM

  65. Yeah, and you know what else, Jason? You rely a lot on “facts” whenever you tell me that the Pope is Catholic. But which “facts” are we talking about here? Some people may be of the opinion that the Pope is Presbyterian. And even though it may not be technically “true,” it sure feels good to say. So what makes you think you get to decide which facts are “correct?”

    After all, isn’t the universe really a democracy wherein reality is determined by popular opinion? Just look at Stephen Colbert. He doesn’t read the news to us. He feels the news at us.

    So stop thinking that you have all the answers just because you possess the capacity for rational thought. That kind of thinking is a relic of the pre-George W. Bush era, and we won’t stand for it.

    Comment by Mike Godesky — 7 January 2006 @ 6:04 PM

  66. Hey –

    Oh, how I wish I had a raspberry emoticon….:-)

    Janene

    Comment by Janene — 7 January 2006 @ 7:13 PM

  67. So stop thinking that you have all the answers just because you possess the capacity for rational thought.

    … well, yes. That’s exactly what I’m getting at.

    People smarter than you and better informed than you have searched for truth before. Don’t think you have a monopoly on it.

    - Chuck

    Comment by Chuck — 7 January 2006 @ 7:32 PM

  68. I think someone missed the sarcasm.

    Comment by Martian — 7 January 2006 @ 7:52 PM

  69. I think someone missed the sarcasm.

    And yet, somehow, it is better this way.

    Hey Janene. I actually use a modified version. Modified for budget, that is. What I did is I went strict for a while and then found out which foods were OK and in what quantities. I judged which food was bad based on my reaction to its presence, how my digestive system reacted, and a simple binary measurement of the presence of vomit. I don’t drink straight milk any more, but some cheese is OK for me. I avoid bread usually, as my digestive system does not like it. Potatoes give me a slight problem, but in a soup are not a big deal.

    Comment by Benjamin Shender — 7 January 2006 @ 8:52 PM

  70. Given that food is so cultural, I have no doubt that many of those studies are heavily biased. I expect it. You’ll notice I don’t make much of any one specifically, but instead rely basically on the general logic of, “How long have we been eating it, and how adapted are we to digest it?” It’s the aggregate that interests me more than any one study–it’s not any one bullet in “the Milk List,” it’s the length of it that I find persuasive.

    Yes, some hunter-gatherers did start eating grains a few thousand years ago, and that’s how we got here. But it was a very recent development, and a radical one. In a few tens of thousands of years, we might be adapted to eat that grain (and the milk, and the rice, and the beans, and the potatoes), but none of that will happen in any of our lifetimes.

    Comment by Jason Godesky — 8 January 2006 @ 12:45 AM

  71. “I don’t think that is completely fair, Jason. I pointed out that you are perfectly willing to eat some things that require processing, but not others. […] And I know you’re not trying to dictate food choices for everyone, but you are coming off as ‘uniquely right’ in this case, if you know what I mean.”

    Janene

    Janene, will you marry me?

    -Paleo Boy

    Comment by Paleo Boy — 8 January 2006 @ 3:45 AM

  72. Hey –

    Sorry, Paleo Boy, I’m much too old for you :-)

    Jason -

    Honest question… how do you know when H-G first started using grains? Do you have any good data points showing that? I would be interested to see, because it seems to me that our ancestors, barring evidence to the contrary, may have been eating wild grains for as long as they shared an ecosystem…

    Ben -

    Yeah, dealing with budget is always fun. That certainly plays into some of my resistance as well. If I could afford it, I would just go out and get three distinct meals per meal — one for Ian, one for Jim and Paleo for me… but I’m afraid that craziness will have to wait until AFTER we win the lottery :-)

    Janene

    Comment by Janene — 8 January 2006 @ 9:53 AM

  73. Well, there’s the lack of genetic adaptation. Also, you can’t eat grains raw. They need to be processed. They need to be ground, and cooked, before they become edible. The grinding requires some equipment that shows up in the archaeological record, and usually requires an at least somewhat sedentary population. Those only appear very, very late. Wild grain isn’t terribly easy to gather, much less process, so most hunter-gatherers give it a pass. This all leaves three possibilities:

    1. We only started eating grain very recently.
    2. All this equipment did exist previously, but it’s all been carefully hidden.
    3. Hunter-gatherers ate grain in the past, using some processing technique unknown to anyone today that did not require any large, archaeologically identifiable remains. Then, 10,000 years ago, a group of people decided this method, which required no large equipment, was inferior to one where the equipment was enormous. Also, we managed to eat grains for tens of thousands of years without developing any of the adaptations other habitual grain-eaters have developed–like lectin-blocking enzymes. Or the shift to an r strategy–a massive population to offset the higher death rate from malnutrition and illness inherent in making a food you have no adaptation for into a primary staple.

    Comment by Jason Godesky — 8 January 2006 @ 12:33 PM

  74. Hey –

    Well, for us to have a genetic adaptation to eating grains, first a random mutation would have had to occur. The lack of a random event does not seem to be evidence :-)

    Equipment? Two rocks is all you need to process grain — and yes, I would expect archaeologists to find rocks now and again that exhibit the proper wear patterns… except that nomadic peoples probably didn’t keep a grinding stone… if they were doing it, I would expect them to pick up a couple of rocks that were handy when they collected some grains… and then immediately discard them again. Who’s gonna carry it around? And why, for the gods’ sake?

    Collecting wild grains really isn’t too difficult. Before domestication, the kernels fell off of the stalks very easily, so it would have been a matter of taking a basket and shaking the stalks over it. Take what comes easily and leave the rest.

    I’m not suggesting that hunter-gatherers used grains as a major staple. I’m merely suggesting that populations living in areas where wild grains grew probably used them in the same way that they used other wild plants: collecting them in season, when handy, and processing them in whatever way they found useful.

    I have often thought that cooking may have started indirectly as a result of beans and grains. Soaking is an easy way to soften the kernels — and they may well have soaked nuts and such already. If they left a pot of soaking grains near the fire, it would have been quite obvious that this made them more palatable and more digestible. O’ course, this is nothing that could be proven, and certainly many other scenarios are equally probable… it’s just something that occurred to me once upon a time.

    Anyway, I was hoping you might have something more concrete… that perhaps wild grains (as we know them) only appeared 15K years ago, or something like that.

    Janene

    Comment by Janene — 8 January 2006 @ 1:17 PM

  75. Obviously not. There are lots of species that are adapted to eating grains–lots of birds, and even one in our own family, Paranthropus boisei. They all have certain things in common: adaptations to eating grains. A specific kind of stomach, or large, flat teeth, etc. They have chemicals to deal with grains–block lectins, that kind of thing. We have none of those. P. boisei went in a different direction a long time ago; it’s not one of our ancestors, and it died out.

    Random mutations happen with a given probability per generation, so over a long enough time period, the probability that one has occurred approaches 1. If there’s natural selection for it, it will prosper. If we’d been eating grains for hundreds of thousands of years, then at some point someone would have appeared with a mutation that allowed her to secrete lectin blockers. She wouldn’t get sick from grain the way the rest of us do, so she’d be healthier, more fertile, and longer-lived–and so would her children. If we’d been eating grains for tens of thousands of years, we should have some kind of adaptation to eating grains–just like every other species that’s been eating grain for hundreds of thousands of years. We don’t, so the obvious implication is that we haven’t been eating grains for very long–that’s why we look like we haven’t been eating grains for very long. We are still Pleistocene animals.
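
    To put rough numbers on that (a back-of-the-envelope sketch; the per-birth rate and population figures here are purely illustrative, not measurements): the probability that a specific mutation shows up at least once is

    P = 1 - (1 - p)^(B x g)

    where p is the chance of that mutation per birth, B is births per generation, and g is the number of generations. With, say, p = 10^-9, B = 10^5, and g = 4,000 (roughly 100,000 years), P ≈ 1 - e^-0.4 ≈ 0.33; over a million years, P ≈ 1 - e^-4 ≈ 0.98. On those assumptions, the mutation all but inevitably appears, and selection does the rest.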

    To make grain worthwhile, you need a big stand of it, and you need to process it in bulk. Yes, the kernels come off the stalks easily, but there’s not all that much on a stalk. To make the effort worthwhile, you need to take a whole bunch of stalks at once and process them in bulk. That’s what makes gathering wild grain difficult. You can’t do it with just a few hand-held stones; you need large milling stones to handle the kind of volume where the effort becomes worthwhile. Otherwise the marginal return is abysmal for a primate that can’t digest it raw–you’d probably spend more calories than you’d get.
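
    Framed as a simple energy budget (again, the symbols are illustrative, not field measurements), gathering pays only when

    f x E_grain > E_search + E_harvest + E_grind + E_cook

    where f is the fraction of the grain’s calories our digestion can actually extract. For a bird, f is high and the processing terms are nearly zero; for a primate with no grain adaptations, f is small and the grinding and cooking terms are large, so the inequality fails for anything short of a very large stand processed in bulk.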

    You certainly need cooking before you can eat grains–which makes me think that cooking probably wasn’t invented for grains, but that we started eating grains after we’d started cooking for some other reason. Even now you can recognize these stages of technological integration: first you use a technology to do some known task more efficiently, then you use it to do something new. Cooking grains would be something new.

    Comment by Jason Godesky — 8 January 2006 @ 1:51 PM

  76. I can imagine a scenario of primitive peoples overhunting and overgathering a large area, then being forced to eat the less nourishing and/or the poisonous just to survive the moment. The lectins may have made them sick, but some did live off them.

    Is there a reference to wild edibles?

    Comment by sevenmmm — 8 January 2006 @ 5:20 PM

  77. Hey Jason –

    There are four reasons why similar adaptations appear repeatedly in the animal kingdom. None of them comes anywhere near ‘because if the animal does X behavior, it is virtually assured that it will adapt to do X better.’

    The chance of a specific genetic mutation occurring (from the perspective of the first Homo sapiens to stick some grain in his mouth) is probably somewhere in the neighborhood of 1/1,000,000,000 — even over the course of ten thousand-plus years. This is NOT one of those things that becomes highly probable given time — the only thing highly probable is that the species WILL adapt more useful traits… but as soon as you point to one trait in particular, you cannot make odds.

    Yes, we are still Pleistocene animals, and yes, you need a big stand of wheat (etc.) to make it worthwhile… but wheat is a first-stage succession plant — it ONLY grows in ecologically damaged areas, and it ONLY grows in large stands.

    Wild wheat, like so many other grasses, would have dropped its seeds in response to blowing wind, or the rustle of passing animals. So when a person came along and grabbed a handful of still-growing stalks and gave it a shake over a basket, perhaps 50% of the kernels would have fallen off. Over time, this ratio got to be less and less, and more work would have been required. But certainly, by then, our ancestors would have already been planting the stuff… or they never would have started.

    And the one thing you refuse to acknowledge is that I am not talking about grains-as-staple. I keep saying that. If it is so ‘certain’ that we would have adapted lectin blockers, why would we not also have adapted ‘cyanide blockers’ or ‘tannin blockers’? After all, we eat nuts!

    What ‘volume’ is worthwhile? You’re talking about ‘civilized’ concepts of scale. Yes, if you want to get the ‘best bang for your buck,’ then you need large grinding stones, etc. But if you want a bowl of porridge, then it doesn’t matter how much work it takes, cos you WANT it. But more importantly, the original wild strains were much simpler to process. Threshing may not have been necessary at all (as compared with the multiple-stage threshing later); I don’t know, but I would be interested to go back to my notes and see how thick or thin the kernel was in wild emmer, how large the berry was, how much waste would have needed to be removed in processing, etc. And/or it may be that grains were mostly used only in times of scarcity: if EVERYTHING took more effort and time, then it would become more ‘economical’ to process.

    The bit on cooking was simply a random thought… you’re right that it was probably adapted to make some job easier… perhaps charring bones to make the marrow easier to access.

    Janene

    Comment by Janene — 9 January 2006 @ 9:41 AM

  78. We eat nuts, but acorns and almonds specifically are fairly recent, and never were a staple. I realize you’re talking about how we transitioned into this way of life, and I think for that you make a fair argument. But you seem to be framing that as if it proves that we’ve been doing so for a very long time, and I don’t think that follows.

    Lectin blockers are just one example; there are any number of grain adaptations we don’t have. We don’t have the really big, flat teeth of P. boisei either, for example. There’s a whole line-up of grain adaptations you see in every other grain-eating mammal, but not in humans.

    In sufficient moderation, most anything is within the limits of our tolerance for survival (though drinking milk into adulthood seems like a fairly perverse kind of obsession with childhood–and I don’t even want to think about the psychological makeup that would make somebody try suckling from a cow….), and grains may well have served as a periodic starvation food, just like pine nuts did in North America. When you’re starving, you eat whatever you can get a hold of.

    That’s a very different thing than making it into a “staff of life” that you live off of as a staple, so I think we’ve left the original topic of whether or not this is a healthy lifestyle far, far behind, and ranged into the subject of how we ever got ourselves into this mess to begin with. But I don’t think you can really disprove my thesis–”Civilization makes us sick”–from that angle.

    Comment by Jason Godesky — 9 January 2006 @ 11:02 AM

  79. Hey Jason –

    That all makes sense to me, and you’re right, we have strayed far from the original thesis.

    I think my whole point here was meant to be that the thesis will be stronger if you acknowledge that it is not ‘cut and dried,’ ‘absolute,’ etc., ad infinitum.

    Some grains in our diet may not be any worse than some beans, some cheese (I’m currently researching how making cheese alters the chemistry of milk — cos quite frankly that is one thing that I DON’T WANT to give up :-) ), some acorns and some poke weed….etc.

    So I’m not trying to change your mind, just to get you to lighten up a bit;-)

    Janene

    Comment by Janene — 9 January 2006 @ 11:12 AM

  80. Ah, OK. I guess we’ve had some miscommunication. You say, “We can eat grains!” and I hear, “A diet of just bread and pasta is a perfectly healthy way to go.” I say grains as a staple is a bad idea, and you hear, “Grains are t3h d3v1l’s f00d!!!!” :)

    But I think we can all agree that grains, beans and potatoes really aren’t very high on the list of things we deal with very well, and whatever a “healthy” level of them may be, it’s sure as hell not anything close to the way we eat them now.

    That, and drinking milk is weird and gross.

    Comment by Jason Godesky — 9 January 2006 @ 11:17 AM

  81. Yes, we can agree on that… but I still want cheese :-)

    Janene

    Comment by Janene — 9 January 2006 @ 11:31 AM

  82. Don’t we all. :)

    Never fear, there’s a whole world of food out there. I’m working on a paleo brownie. :-D

    Comment by Jason Godesky — 9 January 2006 @ 11:50 AM

  83. Hey –

    hmmm…

    2 c nut flour
    1/2 c cocoa
    2 eggs
    1/2 c oil (NOT soy/‘vegetable’ oil)
    1/2 c honey
    1/2 c chopped walnuts

    Am I forgetting anything? ‘Course I’m pulling the quantities out of my ass — I’d have to look at a real recipe to see if those need to be adjusted :-)

    Janene

    Comment by Janene — 9 January 2006 @ 12:05 PM

  84. Oh, yeah… salt, vanilla extract and baking powder — what’s baking powder made of? Anyone?

    -J

    Comment by Janene — 9 January 2006 @ 12:06 PM

  85. Well, chocolate and vanilla are beans…

    Baking powder is made of baking soda and cream of tartar.
    Don’t ask me where those come from though…

    Comment by WackyMorningDJ — 9 January 2006 @ 12:15 PM

  86. It’s a rock. Sodium bicarbonate. Not something we ate in the Paleolithic, most likely. But not really harmful, I think….

    Cream of tartar is also known as potassium bitartrate. Again, not paleo. It seems to be used mostly to make baking powder, to preserve wine, syrup, and vegetables, and to clean metals. Apparently it’s a laxative, and sodium bicarbonate is an antacid. Told you chocolate helped your digestion.

    Comment by Anonymous — 9 January 2006 @ 1:41 PM

  87. That was me.

    Comment by Benjamin Shender — 9 January 2006 @ 1:42 PM

  88. Hey Mike –

    Well, vanilla is not a legume — it’s actually an orchid… cocoa, on the other hand, is a tree… so I think that the name ‘bean’ may be a misnomer in both cases.

    Meanwhile, it looks like cream of tartar is a residue of winemaking, while baking soda is pure sodium bicarbonate, which apparently does occur naturally as ‘nahcolite.’

    So, question is, is wine allowed?

    Janene

    Comment by Janene — 9 January 2006 @ 1:46 PM

  89. Janene,

    I reckon so. We must have been munching on rotten fruit since we were monkeys. Plus, new research keeps appearing saying that wine and other alcoholic stuff is good for you in moderation. Having said that, there are a number of businesses that have an interest in promoting such conclusions - but hey, at least it’s a conclusion I like.

    Comment by Clive — 9 January 2006 @ 2:23 PM

  90. Janene,

    I think the question is how much wine is allowed.

    Comment by Benjamin Shender — 9 January 2006 @ 2:29 PM

  91. Wine is OK on paleo. :)

    Comment by Jason Godesky — 9 January 2006 @ 2:43 PM

  92. Wow, longest Anthropik thread ever.

    Every food causes something to go wrong in someone.

    Your argument FOR nuts ignores nut allergies. Nuts could kill someone.

    And so what if you can prove that something may hurt someone? Meat gives some people heart disease. Some vegetables give some people painful gas. Breathing air gives some people an asthma attack. Is it a matter of adaptation?

    But Jason, you are wrong about something. I know it hurts, but you ignored this statement made above:

    African Bantu show only 10 percent lactase persistence, while African Tutsi show nearly 80 percent lactase persistence (Source: Wikipedia).

    Okay: two non-European groups on the same continent. One produces lactase, which, despite health effects, MEANS THEY ARE ADAPTED. Their ancestral genetic structure did not include adult lactase production, but now 80 percent of one tribe has it–a tribe that has historically herded cows. The other tribe shows only “mutant” levels of genetic lactase production, and has historically been hunter-gatherers.

    So where am I wrong? Am I wrong that 80 percent of a population produces lactase, and is therefore genetically adapted to drink THE MILK OF A COW?

    They didn’t have the gene that made lactase production possible; now they do. THEY ADAPTED.

    Just because something isn’t good for you doesn’t mean you aren’t ADAPTED to it. Are we now talking degrees? Yes. You may have the enzyme, but that doesn’t take care of everything.

    So if we aren’t “adapted” to eating something, then we are using “adaptation” as our eating frame. Adaptation is so goddamned subjective; aren’t we better off just saying there is no right or wrong way to eat, so long as there is no ONE way?

    I’m saying fuck adaptation as a measurement of utility. Don’t we have enough adaptation to do in the present?

    Mike,

    Three mushrooms will kill you for sure. Two are white amanitas with two distinguishing features: don’t eat anything with a sac or a veil (you can look that up if you want to know more). The other is Galerina autumnalis, a tiny orange mushroom that grows in groups.

    So all you have to do is this:

    Promise me you’ll never eat an all-white mushroom with a sac and a veil, and you’ll never eat little tiny orange ones.

    Then after that, I’ll tell you you can eat anything you want, as long as it tastes good to you. So now you know what will kill you. Keep in mind there are other fungi that some people can eat while others cannot; it’s all a matter of experimentation.

    Even if you did just eat any mushroom, you are more likely to go talk to God than go visit Him.

    Comment by TonyZ — 9 January 2006 @ 3:12 PM

  93. Some people, recently, have begun to develop nut allergies.

    No person has ever been adapted to drinking milk.

    Being adapted to a food means more than just being able to consume it without up-chucking (which is all that “lactose tolerance” means); it also means that it won’t wreck your health.

    The Tutsis and Bantu were both heavily influenced by Arabs in the first millennium CE, and probably even before. Lactose tolerance in Africa came from the Middle East, where it allowed the Bantu to produce a large number of sickly warriors and conquer the southern half of the continent–because it doesn’t much matter how weak and unhealthy they are if there’s a few million of them. Sound familiar?

    So, they’re not adapted at all. None of us are. Milk doesn’t make some of us sick; it makes all of us sick. It’s not a food for humans. It’s on the short list.

    Comment by Jason Godesky — 9 January 2006 @ 3:19 PM

  94. Jason,

    I believe if we were in person this would be an easy matter to settle; however, pardon me for dragging this out.

    I mean, obviously, we’re working with two minds thinking about the same thing in two different ways.

    So, no need to keep going in circles here, but here’s my official stance:

    The production of lactase in individuals should be treated as a partial adaptation. Not a perfect one, but certainly the body doesn’t change the genetic code if the need doesn’t arise. Even if it were purely a matter of selection, lactase production has been selected for, which in my opinion makes it an adaptation. Much like my alarm clock has adapted me to working 9-5–but it’s still killing me.

    Regardless, I still eat sour cream (fermented, tyvm) on a daily basis, and am not morally opposed to cheese. Milkshakes make me sick. All sentient-being industry makes me ill.

    Comment by TonyZ — 9 January 2006 @ 3:55 PM

  95. OK … here’s my official stance.

    Drinking milk beyond puberty is abnormal and maladaptive. Drinking the milk of another species is simply bizarre. But do anything long enough, and you’re bound to adapt to it. Drinking cow’s milk is an incredibly awful idea, but some of us have developed the most meager and superficial of imaginable adaptations. Even those few will still suffer from consuming something so clearly not meant for human consumption. That gives each of us, lactose intolerant or not, two options: we can either continue pursuing this bizarre, maladaptive strategy and hope that in a few million years our descendants might be fully adapted to it, or we can just eat like normal human beings. Since there’s nothing really good about drinking another animal’s milk, and a lot of awful things about domestication, I see no reason why anyone would want to spend any amount of effort pursuing the former course, when the latter is the common-sense, healthiest, and most natural one. It’s a lot of effort, yes, and for an awful cause.

    Comment by Jason Godesky — 9 January 2006 @ 4:17 PM

  96. Jason,

    I was in the store today, and I saw yogurt with active cultures. I asked the wife what the purpose of that was, and she told me that the cultures eat the lactose upon hitting the stomach, making the product digestible, even to people with lactose intolerance. I looked it up when I got home. She was, as usual, right.

    All food has to be processed to some degree - from the simple processing of pulling it off a tree and grinding it between your teeth, to the more complex cooking of meat, to the even more complex leaching of tannins from acorns. Somewhere in there is yogurt and cheese and chowing down on an occasional bowl of couscous and quinoa.

    Your argument against milk and grains seems to hinge on the idea that humans should not consume foods to which our bodies have not adapted; that such a thing is bad business. But where do you draw the line on processing? Last I checked, humans had a VERY difficult time processing raw meat and I wouldn’t even dream of shoving inner tree bark straight from the pine down my gullet. Since humans can’t eat these foods without processing, are they therefore bad? Should we abstain from any form of food that cannot be eaten “as is?” Since we have evolved/adapted no method of digesting raw tree bark, raw wild boar rump, raw milk and raw grain, should we shun these foods?

    Humans never developed the ability to completely deal with lactose or lectins, but we never developed the ability to completely deal with tannins or raw meat, either. The question is what level of preparation is required. All these foods can be bad for humans when not prepared properly. So why rage against milk and grains? Why draw the line there, when they can be a healthy part of a diet when properly prepared?

    You are uncomfortable with the idea of consuming another animal’s milk, especially after childhood. You dislike grains because they are the cornerstone of civilization. To select foods such as these and label them “bad foods” based on your personal feelings towards them fails to take the whole picture into account.

    - Chuck

    Comment by Chuck — 9 January 2006 @ 9:25 PM

  97. Been thinking a lot about these ideas lately, and I believe the argument is: adapting to agriculture is a credit to our complex society, but it is not self-sustaining, leading to civilization’s ultimate collapse.

    Comment by sevenmmm — 9 January 2006 @ 9:51 PM

  98. Chuck,

    Eating can never be “processing.” Processing is what you need to do between gathering and eating so that it won’t kill you. Only a civilized person could ever say with a straight face, “All food has to be processed to some degree,” because we’re the only things on the planet that eat things we’re not adapted to eat on a regular basis.

    The problem isn’t the processing, but the maladaptation. There is nothing ambiguous about this distinction. It has not a thing to do with my feelings. If we’re adapted to eat it, eat it. If we’re not, don’t. As a general rule of thumb, if you can eat it raw, it’s probably OK to serve it up any way you please. The need for processing is just a big ol’ red flag that maybe this isn’t the best thing in the world for you to begin with. Your examples of raw meat and tree bark are totally fallacious, though. Your inability to eat them has nothing to do with your physical abilities; it is a conditioned disgust response. People eat both raw meat and tree bark all the time, and are healthier for it. I myself have dined on raw meat and raw tree bark, but never raw grains. Eat unprocessed grain, and you will die, regardless of your upbringing. Drink milk, and unless you’re one of those “lucky” few with a particular mutation, you will get sick, regardless of your upbringing. Since I’m talking about physical adaptation, comparisons to culturally constructed disgust responses are totally irrelevant.

    I actually knew that about yogurt. I believe the cheese-making process also breaks down lactose into lactic acid. But my feelings about this developed in response to these facts, not vice versa. If anything, my feelings are quite the opposite. Until very recently, I was as much a bread-and-milk addict as any of you. Breaking that gave me friggin’ DT’s. If there was any semi-rational justification to keep getting my fix, I’d probably cling to it like any addict. But there’s not.

    Seven,

    Cancer is a “credit”? Well, I suppose you’re free to interpret it that way, but I would say that the introduction of cancer was probably more along the lines of “bad.”

    We haven’t adapted to agriculture. That’s why we have cancer, and disease, and stress-related deaths. These are not normal. In any other species, we would recognize these immediately as the most obvious signs of an animal living in conditions it is completely unadapted to. So why is it so controversial to apply that standard to ourselves? Because it might mean eggs and bacon for breakfast, instead of Wheaties in milk?

    Bear in mind, these are also the foods that bind us. Their production and processing is centralized. The price of a loaf of bread is war, genocide, and totalitarian political control. Seems a little steep for a loaf of bread when a juicy cut of venison is free, don’t it? Not only do we suffer for these foods–they’re not even good foods for us to begin with!

    But they are addictive, so a good first step towards your own emancipation would probably be to break that addiction. Once you’re on paleo, and the smell of a bakery is enough to make you sick, the essentially toxic nature of Neolithic foods should become self-evident.

    Comment by Jason Godesky — 9 January 2006 @ 10:20 PM

  99. Jason, you reacted quite the opposite of what I had envisioned. Guess my idea was too much in the form of a riddle.

    Credit = civilization, but civilization = collapse.

    I’m using the word credit much like you use the word collapse (which to a primitivist would be good).

    Comment by sevenmmm — 9 January 2006 @ 11:19 PM

  100. Jason -

    You’ve eaten raw meat, straight from the carcass on the ground into your mouth? Eaten right where the animal was killed, minutes after the animal’s heart stopped beating?

    Almost everything humans eat is processed in some way, shape or form. The level of processing that makes a food item healthy may be great or minimal, but in the end, it still makes it healthy. I submit yogurt and soaked quinoa.

    - Chuck

    Comment by Chuck — 10 January 2006 @ 10:02 AM

  101. Grains must be processed in order to be non-lethal.

    Even after it’s processed, grains still give us cancer and a whole long list of other maladies.

    From this, you conclude, “but in the end, it still makes it healthy.”

    No–it might make it non-lethal, but it doesn’t always make it healthy. You can consider bread “healthy” only if you consider cancer to be healthy.

    There’s a difference between “healthy” and “non-lethal.” Contrary to proverbial wisdom, whatever doesn’t kill you does not necessarily make you stronger.

    Comment by Jason Godesky — 10 January 2006 @ 10:09 AM

  102. Raw meat is tough to chew. If you tried to fulfill your energy requirements with raw meat alone, you’d be chewing all day.

    Comment by Anonymous — 10 January 2006 @ 1:23 PM

  103. I don’t like the consistency, myself, so I cook it. Being descended from scavengers as we are, cooked meat is preferable–it has a consistency that more closely resembles rotted flesh. But plenty of people live quite healthily on raw meat, and have for quite some time.

    Comment by Jason Godesky — 10 January 2006 @ 2:28 PM

  104. Hey –

    Pretty much all ‘traditional’ cultures have some form of raw animal food in their repertoire. Sometimes organ meats, sometimes ground (steak tartare is AWESOME!), and of course sushi…

    As for me… I have often said that I probably would like fresh meat, still warm… when I eat steak, it’s always seared on the outside but as red as you can make it on the inside :-)

    Janene

    (So, did I just gross anyone out? Sorry.)

    Comment by Janene — 10 January 2006 @ 5:13 PM

  105. No, you sound like my girlfriend, though. ;)

    I think I see Jason’s overall point: it’s not just about adaptations.

    Don’t we have a little thing called an appendix? And why is it so dang vestigial?

    Comment by TonyZ — 10 January 2006 @ 6:24 PM

  106. On the subject of the chewiness of raw meat, the best cuts are fatty chicken and turkey breasts, as well as various species of fish, which melt in the mouth. Pork fillets are also quite good. Steak is indeed chewy and must generally be tenderized. I suspect that wild humans probably just cut the most tender bits out of the animals they killed and left the rest.

    I’m not sure what point was being made about the appendix above, but it is in fact not a useless vestige. In healthy humans it serves as “gut-associated lymphoid tissue” whose function is to detect intestinal parasites in order to trigger increased mucus production and other defense mechanisms.

    Comment by Cornfed — 10 January 2006 @ 7:09 PM

  107. Tony — There’s nothing maladaptive about a vestigial organ. It doesn’t work against you, so there’s nothing to select against it. Maladaptive behaviors actually work against you.

    Ever notice that only humans have to brush their teeth? Ever wonder why? The amylase in our saliva breaks grain starches down into sugars, and the bacteria in our mouths turn those sugars into enamel-eroding acids, causing tooth decay. If we were adapted to eating grains, our teeth would have much thicker enamel (like P. boisei), or the chemistry of our mouth would change, or both. Instead, grains make our teeth rot out of our face.

    Comment by Jason Godesky — 10 January 2006 @ 8:40 PM

  108. Jason -

    You’ve eaten truly raw meat, straight from the carcass on the ground into your mouth? Eaten it right where the animal was killed, minutes after the animal’s heart stopped beating, flesh still dripping with hot, not warm, but hot, steaming blood?

    Was the blood drained off, even slightly? Had the meat been soaked in water? Had the flesh been kept cool (by whatever method) to prevent bacterial growth after it was killed? What about how long it had been dead? Was there time for any parasites that may have been inside to die? Was the meat treated with any sort of herbs or acids? Did you eat anything else with this truly raw meat that may have aided in its digestion?

    - Chuck

    Comment by Chuck — 10 January 2006 @ 8:53 PM

  109. That’s a much more particular definition of raw than I’m used to. No, never quite that immediate. Cutting a chunk off of a carcass counts as “processing”?

    Comment by Jason Godesky — 10 January 2006 @ 8:58 PM

  110. Chuck, you’re completely missing the point. Jason isn’t saying that grains are bad for us because we can’t eat them raw. He’s not a raw foodist. He’s saying that grains are bad for us because grains are bad for us. They’re not healthy, they greatly increase one’s risk of cancer, they offer no nutrition, they make us fat, etc. etc. etc.

    The “if we don’t cook them, they’re deadly” thing was hardly the basis of his entire argument. It was just an interesting sidenote that you’ve spun completely out of proportion seemingly because you can’t come up with any arguments against his actual point.

    Note that I said “seemingly.” If you can convincingly argue that grains are healthy and therefore worth eating, please do so now and drop this ridiculous strawman.

    Comment by Giulianna Lamanna — 10 January 2006 @ 9:02 PM

  111. Giuli -

    Fair enough. I can see how it might look like a straw man, although that wasn’t my intention. I always try to figure in conversational drift in particularly long threads like this one. It looks like the drift happened, but since I wasn’t vocal enough, only I was “on the new topic”. It makes for embarrassing misunderstandings.

    At this point, I’m not arguing that grains and milk as civilization presents them aren’t inherently bad for us, because I’ve been shown overwhelming evidence for it. Rather, I’m trying to put out ideas on what sort of processing scale might be used to define “bad for us,” because I see the matter as more than simply black and white, healthy and unhealthy. I suggest that there’s a spectrum of healthy to unhealthy, based on the amount of processing required to make edible, or something along those lines.

    I wouldn’t call cutting a chunk of meat off a carcass processing, but pretty much everything beyond that I would. Processing is anything done to food to make it healthier and/or easier to eat.

    Grains and milk have to be processed a lot more than meat in order to become healthy to eat. Grains must be sprouted and milk made into yogurt or certain cheeses, whereas meat just has to be drained of blood a little bit, be cooked, or be allowed to sit a few days. I see no evidence for sprouted grains and/or yogurt causing cancer or being unhealthy in the ways described above. I also see no evidence for truly, truly raw meat being healthy, because I doubt any comprehensive studies on truly raw meat-eating have been conducted.

    Some foods, like carrots or apples, require absolutely no processing, and have no evidence of adverse health effects. Pick, eat. Some foods, like meat, I don’t have any information about the health effects of truly-raw eating, but I can guess that they might have negative effects. Some foods, like grains and dairy products, can have terrifying effects if not properly processed, but are harmless, and can even provide excellent nutrition, if properly processed.

    I find the idea of a scale of healthful processing to be food for thought (NPI). Take it for what it’s worth. In the very near future, people will be struggling to find food in any form. Knowing how to healthily consume foods that are on the bad side of the processing spectrum (like grains and dairy) might mean the difference between life and death for a lot of people when these are the only foods they can obtain, especially with so many diseases going around. Healthy equals a good immune system and robust endurance. This is way more of a practical application than the essay intended, but there it is.

    - Chuck

    Comment by Chuck — 11 January 2006 @ 8:24 PM

  112. OK, so let me go on record now as saying: there is no line after which processing is bad. Processing doesn’t necessarily have a thing to do with how healthy a food is; it’s just a good rule of thumb–especially if preparing a food improperly can kill you.

    That’s not true of raw meat, though. You’re wrong–many studies have been done on raw meat, and the results are fantastic. Assuming by “raw” you allow for cutting it off the carcass, and maybe the amount of time it takes to transport it. That’s precisely what humans have eaten for most of our evolutionary lives. Cooking actually kills a lot of the nutrients in meat, making it less nutritious, but as Cordain showed in the PDF I have in the Vault, foragers didn’t just get most of their protein from meat–they got most of their energy from meat, too. And for the most part, until recently, that meat was raw.

    I like my meat cooked, though. Sure, it’s not quite as nutritious, but it’s still on the positive side of the spectrum.

    I’m baffled as to how you can continue to suggest that grains and milk are healthy, though. There’s absolutely no evidence to say that they’re healthy, but mountains of evidence that they’re hideously unhealthy. My bit on the processing was to highlight that, even intuitively, we should know they’re bad news: it takes all this processing just to make grains non-lethal, and even then they’re still bad for us.

    Ever seen a fat forager? That’s not because they didn’t eat well. Quite the opposite, in fact. We did get carbs in our Paleolithic diet–but nowhere near the scale we have now. After proper processing, most of the anti-nutrients in grain should be broken down. There’s always some lectins and neurotoxins left, though. But even then, what you have in a loaf of bread is pretty much just a bunch of empty carbs. It’s cheap energy, yes, but that’s not something you can live off of. There are other things in there that nourish other animals, but humans can’t digest any of them, so they just pass through us. The only thing we get out of bread is sugar. Frankly, you’d probably be better off with a bucket of sugar–at least then you wouldn’t be eating a carcinogen (which is what bread is).

    As I’ll be exploring in a thesis very soon, for those for whom bread and dairy are an option, there won’t be any food at all. For those of us for whom they aren’t an option, we’ll already have a buffet of delicious, healthy foods to choose from. Surviving the collapse is a fairly simple matter: end your dependence on civilization. That means ending your dependence on cereal grains and dairy, and a good first step for that is to understand what they’ve been doing to your body all this time.

    Comment by Jason Godesky — 11 January 2006 @ 8:53 PM

  113. Okay. :) I have neither the time nor the inclination to look it up, but since you’ve always pretty much been right, I’ll take your word for it.

    Fun argument, anyway.

    - Chuck

    Comment by Chuck — 11 January 2006 @ 10:49 PM

  114. I should add as a trailer that your general idea is reasonable, your methodology sound, and I would agree with you about a spectrum of healthiness, preparation, etc.

    The only thing I think you got wrong is just how far down the scale grain and cow’s milk are–somewhere in the neighborhood of nightshade, bat piss and slime. But that’s stuff we’re trained from birth specifically not to know. I know I was shocked and incredulous the first time I heard that grain might not be the most wonderfulest thing I’ve ever eaten.

    Comment by Jason Godesky — 11 January 2006 @ 11:35 PM

  115. I don’t think a lot of people will be eating huge amounts of grain in the future–not because it’s bad for you (although it certainly is some nasty stuff). People won’t be eating lots of grains because you just can’t get as much food per acre out of grain as you can from other food production methods. Agriculture, and especially post-green revolution agriculture, relies partly on sheer land area to make its massive surpluses–and with non-burnt-out farmable land at a premium, who’s going to have the space to grow wasteful grain and raise inefficient cattle?

    - Chuck

    Comment by Chuck — 12 January 2006 @ 9:52 AM

  116. That, and we killed all the agricultural land–which will make it impossible in the short term. In the long term, the Holocene’s coming to an end–whether for hotter or colder is hard to say yet. But whatever it is, it’s going to be a very different climate, and one not terribly suited to agriculture.

    I agree, on the society level, the health issue will have little to do with it. That’s just a personal motivator.

    Comment by Jason Godesky — 12 January 2006 @ 10:14 AM

  117. What about breakfast cereals with added vitamins and minerals?
    They have the energy and the nutrition.

    Comment by _Gi — 12 January 2006 @ 3:01 PM

  118. You are joking, right? This thread has thrown my sarcasmeter all out of whack.

    Comment by Jason Godesky — 12 January 2006 @ 3:07 PM

  119. This is probably beating a dead horse, but I just read this in my anatomy textbook and thought it was relevant:
    “In addition, the stomachs of newborn infants (but not of adults) produce rennin and gastric lipase, enzymes important for the digestion of milk. Rennin coagulates milk proteins; gastric lipase initiates the digestion of milk fats.”

    Miranda -
    I tried your apple, raisin, cinnamon recipe and it was delicious. My daughter loves it too.

    Janene -
    You said that hunter gatherer diets have more calcium than modern diets. Where does the calcium come from? Bones? Vegetables? This is an honest question because (especially as the parent of a young child) I am constantly bombarded by people telling me that I and my daughter HAVE to drink a certain amount of milk or milk substitutes in order to be healthy.

    Comment by Vicky — 16 January 2006 @ 1:02 PM

  120. Wild edible plants, actually, have enormous amounts of calcium. Other vitamins and minerals, too. A few dandelion leaves can have more vitamin C than a glass of orange juice.

    Comment by Jason Godesky — 16 January 2006 @ 1:06 PM

  121. Only wild plants? Because when I told my doctor that I don’t drink milk, but eat plenty of green leafy vegetables, he said that I don’t get enough calcium and need to take supplements. Was he misinformed? I know that doctors don’t get a lot of training in nutrition.

    Comment by Vicky — 16 January 2006 @ 9:23 PM

  122. Hey Vicky –

    I’m no nutritionist, but a good general rule of thumb is the darker the greens, the heavier the nutrient load…

    I used to raise iguanas and learned a lot about their needs in the process… they need a calcium-potassium ratio of 3-1, and the best way to meet that was with mustard greens, turnip greens, kale etc. Spinach is okay, most lettuces are worthless… so your doctor may be misinformed, or he may be assuming ‘greens’ means ‘lettuce’…

    The bit about calcium, specifically, came from one of the ‘Milk is bad for you’ links that Jason offered up the page a ways. Unfortunately, I don’t remember which one… except it was NOT the one saying that calcium ‘overdose’ leads to osteoporosis :-)

    Janene

    Comment by Janene — 16 January 2006 @ 9:46 PM

  123. In general, the vegetables we domesticated also happened to be the most worthless, nutritionally-bankrupt plants available. That was typically made worse by the process of domestication. In fact, I saw a study showing that their nutritional content has actually gotten significantly worse just since the 1970s.

    Comment by Jason Godesky — 16 January 2006 @ 10:44 PM

  124. Hey Jason –

    I’ve been doing some more research this week, now that I am getting back on to pseudo-paleo. So, I am wondering what your thoughts are on tubers?

    I know white potatoes are a no-no — but how has your research stacked up on sweet potatoes, sunchokes, rutabagas, etc?

    Most of what I have found indicates that the die-hard paleo folks say NO, but the foods themselves are edible raw and have perhaps been part of our diet since A. afarensis…

    Janene

    Comment by Janene — 17 January 2006 @ 7:35 PM

  125. Really? Paleo folks don’t like tubers? I’d assumed tubers were on the A-list. Is it just potato-obsessed fanaticism?

    Comment by Jason Godesky — 17 January 2006 @ 8:06 PM

  126. Hey –

    hmm… okay, I guess that answers my question:-)

    I think it was paleofood.com that said ‘no potatoes, no sweet potatoes, no sunchokes’… when I got to sunchokes, I said wha’huh?

    I think it may be cross-contamination from the low-carb people. Sure, tubers are starchy, so if your intent is to lose as much weight as quickly as possible, then I can see it. But when you skip the tubers and fruit, it also leaves you weak in the knees, and that’s no way to live…

    (One site even said that squashes are sometimes considered off limits ????)

    Janene

    Comment by Janene — 17 January 2006 @ 8:41 PM

  127. Thanks for all the knowledge, information and inspiration, folks (tubers and legumes - what great names!). For the first time in nearly a decade, my kitchen smells of searing, organic flesh…

    Official terminology for my haphazard approach? Paleoccasional

    Comment by JCamasto — 17 January 2006 @ 9:35 PM

  128. Potatoes are just a big block of energy. Besides, they only taste good when you fry them to death in oil or smother them with butter, cheese, and sour cream. So… what are you eating? Other than that, they only work in soups. My opinion, at least.

    Janene, who said no fruit?

    Comment by Benjamin Shender — 17 January 2006 @ 11:37 PM

  129. Potatoes are a great source of potassium, vitamin C and vitamin B6. And they’re delicious :-).

    Comment by Vicky — 19 January 2006 @ 9:00 AM

  130. Hey –

    Potatoes also have glycoalkaloids that can be toxic, even deadly, in sufficient quantity. Don’t ever eat raw white potatoes and avoid any white potato that has begun to ‘green’. Concentrations are highest in the skin of the potato — which is also where you find most of the vitamins and minerals… double bind :-0

    Ben — ‘no fruit’ is strictly an Atkins/Low Carb Diet refrain — you never see it in paleo lit.

    Janene

    Comment by Janene — 19 January 2006 @ 9:56 AM

  131. Hey –

    Saw something interesting on a Discover show this weekend.

    Apparently, some Swedish dentistry researchers are experimenting with a lacto-bacilli spray as an alternative to traditional cavity treatments and preventatives.

    Sounds like another little bonus to eating at least a little bit of lacto-fermented foods in your diet… anyone want homemade ginger ale? :-)

    Janene

    Comment by Janene — 24 January 2006 @ 9:24 AM

  132. Or, rather than eating things you can’t really digest to help alleviate some of the negative effects of another thing you eat but can’t really digest, you could always just not eat cereal grains in the first place, so then you don’t have to worry about cavities at all. :)

    Comment by Jason Godesky — 24 January 2006 @ 9:51 AM

  133. Hey –

    Well, there are lots of things that you can lacto-ferment… the ginger ale is one; I have also heard a suggestion that sauerkraut has been shown to raise resistance to bird flu–and it’s fermented…

    Point is, more reason to consider lacto-bacilli our friends :-)

    And since we have mostly eaten a ‘civilized’ diet for most of our lives, our teeth have already lost a good portion of their enamel… so why not take advantage of the boost? Would you rather get an existing cavity filled, or treat it with natural enzymes to prevent continued decay? (Not to mention, getting cavities filled may get really difficult here, pretty soon…)

    Janene

    Comment by Janene — 24 January 2006 @ 10:12 AM

  134. Wow - This is definitely one of the best theses so far.

    This morning I refused to have milk with my cereal (of which I chose the least grain-filled option lol). And at lunchtime for the first time I can ever remember I prepared myself a salad instead of having a sandwich lol.

    I’ve never ever thought of changing my eating habits like that before! Thanks for the useful information.

    Sean

    Comment by Sean Case — 1 February 2006 @ 11:05 AM

  135. Jason said:

    “In fact, I saw a study showing that their nutritional content has actually gotten significantly worse just since the 1970s.”

    —————

    I recently read about this in the Guardian (UK). The news report can be found at this address: http://www.commondreams.org/headlines06/0202-06.htm

    Sorry, I haven’t been able to figure out the coding on this site to place hyperlinks or do quotes yet.

    Comment by Roxy — 4 February 2006 @ 2:39 PM

  136. Hey –

    Good news.

    I’ve been experimenting with those paleo-brownies… and I think I nailed it (particularly if, like me, you prefer chewy, unfrosted brownies).

    2 C ground raw sunflower seeds (or other nut)
    1/2 C Cocoa powder
    1 t baking soda
    1 t baking powder
    1 Egg
    2/3 C honey
    1/4 C vegetable oil
    (optional)
    1/2 c semi-sweet chocolate pieces (not strictly paleo, so its up to you)
    1/2 c chopped walnuts

    Mix the dry ingredients really well, then add egg and honey. Add enough oil to make a moist paste (not a batter). Mix in nuts and chocolate if desired.

    Spread into a 9×6 pan (or similar)

    Bake at 325°F for about 20 minutes (watch closely; it should be barely ‘dry’ on the surface. This recipe burns easily.)

    Unless you have a food mill, the sunflower seeds will still be a little chunky, and so you will see white specks in the brownies. It doesn’t detract, however; in fact, I think it adds a little additional nutty goodness :-)

    Janene

    Comment by Janene — 12 February 2006 @ 11:54 AM

  137. All right everyone, awesome thesis and very interesting discussion on diet.

    I am really surprised that there was absolutely no mention of the “Eat based on your (blood) type” studies that make an attempt to categorize foods that are good/neutral/bad based on your blood type.

    Blood types developed during the periods of human emigration from Africa, and as people adapted to the various new bioregions, their diets changed; hence blood type O cannot deal with such things as corn, wheat, etc., but I can eat all the meats I can find… while some of the other blood types can quite readily digest more parts of the corn, wheat, rye, and tubers, but get heart disease when they eat a lot of meat…

    The point I’m trying to make is this:

    Jason - you make good points, and it sounds like you are of blood type O or maybe A, and your arguments hold true(er) for you than for others that have argued your points.

    Everyone else - everyone that seems to favor the grains, tubers, milk diet: you might be more in the blood type B or AB camp, and hence less meat and more grains may be your preference…

    I’m not a nutritionist either, I’m a computer tech, but I CAN diagnose some of my body’s symptoms. I have recently cut out corn and peanuts (both on the avoid list, and after some trial and error) and reduced recurring migraine headaches to once or twice a month from several times a week… so it works for me… I suggest you check out the following:

    http://www.dadamo.com/
    http://www.diets-reviewed.com/bloodtypediet.html
    http://www.onlinediets.biz/bloodtypediet/
    http://www.chasefreedom.com/eatrightforyourtype.html

    Here is a food list site for all the blood types: http://er4yt.beggerlybend.com/contents.html

    And here is a site that offers blood type recipes (Jason - you might find some nice game recipes here)
    http://www.recipenet.org/health/er4yt.htm

    Jason - keep up the good work, I will continue on with the rest of the thesis later…

    Everyone else - hope you all figure out your best possible diet in this not-so-perfect world

    Rich

    Comment by Rich — 15 February 2006 @ 12:33 AM

    Jason - you make good points, and it sounds like you are of blood type O or maybe A, and your arguments hold true(er) for you than for others that have argued your points.

    Actually, I believe I’m B. :)

    But there’s a reason I put so much stock into the paleo diet and none of the other “fad” diets out there, including the blood types (which I have heard of before, and didn’t bring up for reasons I’ll get to presently).

    Namely, the paleo diet isn’t based on any kind of contemporary studies, which are ultimately based on correlations, which are always pretty ambiguous. Correlations never prove causation, no matter what the science reporters say. But the paleo diet is based on our understanding of human evolution. That’s a far more stable foundation.

    Blood types come from the antibodies in the blood. They provide a diversity that protects groups from disease and infection. They weren’t restricted geographically, but equally distributed. You won’t find a tribe of all people with A, and another with all AB, and another with all B–you get a tribe with some A, and some B, and some AB, and some O.

    They didn’t know who had what blood type, and they certainly didn’t eat according to it. They all ate the same food, together, as a group. So, how could such a thing ever develop, as a question of evolution? Some things are superfluous to evolutionary adaptation, but things involving diet are the most basic, necessary evolutionary adaptations you can ask for. We adapt to a given diet very closely. How did B or AB blood-type people ever survive the millions of years when grains weren’t a major viable food source, and nearly the entire diet came from meat?

    Comment by Jason Godesky — 15 February 2006 @ 9:14 AM

  139. Jason:

    I have read a theory that the Paleolithic blood type was “O” and that the other types evolved later. If this is true, then a correlation between blood type and the ability to digest and utilize different foods could be related to the geographic location of ancestors. How do you know that blood types were evenly distributed in those times?

    Comment by Bob Harrison — 15 February 2006 @ 11:13 AM

  140. Well, that wouldn’t make much sense–other primates have blood types. For instance, that “positive” or “negative” in blood type? That’s Rh factor. Rh here stands for “Rhesus,” as in “Rhesus monkey,” because that’s where it was first discovered.

    So, humans started out with just one blood type, and then after the Agricultural Revolution we developed the same kind of blood types that are found in all other primates? That seems like a pretty silly scenario to me. Can you find a source for that, Bob?

    Comment by Jason Godesky — 15 February 2006 @ 11:28 AM

    This is my source: http://www.dadamo.com/napharm/store3/template2/encyclopedia.html. I am very dubious about this article, as the author has something to sell. You might check his references.
    I was asking how you knew that blood types were “equally distributed”. I don’t see that the fact that both primates and we have a gene for Rh factor has much to do with human types O, A, and B, which are determined by other genes.

    Comment by Bob Harrison — 15 February 2006 @ 12:27 PM

  142. OK, “equally distributed” isn’t quite right. But they’re not geographically set, either. Most tribes today have a mix of blood types. Chimp troops have a mix of blood types. Based on this, I would presume that most human groups throughout history have had a pretty good mix of blood types.

    Comment by Jason Godesky — 15 February 2006 @ 1:21 PM

    Do you also presume, but not know, that Paleolithic tribes had blood types other than “O” and Rh+/-?

    Comment by Bob Harrison — 15 February 2006 @ 2:59 PM

  144. In the same sense that I presume, but don’t know, that Paleolithic tribes were human, rather than reptilian space aliens with a penchant for burying primate bones and a bizarre sense of humor playing a practical joke that would take two million years to pan out. That is, given the evidence we have, Ockham’s Razor indicates that Paleolithic tribes had a healthy mix of blood types, just like humans today, and other primate species in the wild.

    To suggest otherwise would mean that when humans descended from other primates, we lost all but one blood type, survived in such a decrepit state for millions of years (groups with just one blood type would be horribly prone to disease, after all), and then, once we reached the Neolithic, not only developed new antigens (and thus, new blood types), but developed the very same blood types as our primate cousins, whom we’d separated from millions of years before.

    Which explanation do you think is more likely?

    In that sense, yes, I “know” it. Blood doesn’t fossilize very well.

    That said, while my case may rely on Ockham’s Razor, I can’t see a single shred of viable evidence for this idea that blood types only re-appeared in humans a few thousand years ago.

    Comment by Jason Godesky — 15 February 2006 @ 3:26 PM

  145. Ooo! I know a group over here that subscribes to the Reptilian Alien theory. They’re currently controlling the White House with their insidious blood sacrifices and sex orgies.

    Oh yeah… and they’re putting fluoride in our water as part of their mind-control experiments.

    No, seriously — fluoride.

    :)

    Bill Maxwell

    Comment by Bill Maxwell — 15 February 2006 @ 4:59 PM

  146. I have found a reference that critiques the “Blood Type Diet” with facts. It seems that types A and B, rather than O, were likely the originals in humans. As I said, I had doubts about the accuracy of D’Adamo’s theory. I wanted something stronger than your opinion, Jason, to critique it. http://www.cybermacro.com/forum/showthread.php?t=186

    Bill: Do you consider anyone who questions Jason to be as ridiculous as those who are concerned with alien reptiles?

    Comment by Bob Harrison — 15 February 2006 @ 7:59 PM

  147. I’ve run into this before as well. I was told that type A people were best suited for agriculturally-based diets. I replied that I was type A, and that would mean that paleo would have made me weak as a kitten, rather than giving me the best health I’ve ever enjoyed. I seem to recall once being told that Native Americans had a high concentration of type O, but I have no resource for that, only a vague memory from school.

    Bob, those alien reptiles are a major concern you know. Making light of them is dangerous. :)

    Rich, interestingly enough, both peanuts and margarine are off limits on the paleo diet as well. The paleo diet has a record of helping people with a host of problems, including migraines, diabetes, cancer, and obesity-induced foot pain.

    Comment by Benjamin Shender — 15 February 2006 @ 8:59 PM

  148. “Bill: Do you consider anyone who questions Jason to be as ridiculous as those who are concerned with alien reptiles?”

    Bob,

    Hell no! The comment on the reptile aliens (as weird as it seems) was legitimate. I ran into this hip-hop club on my local college campus who are interested in sustainability and raw food. Some real nice guys but when you talk to them long enough, you find out some of them are followers of David Icke (http://en.wikipedia.org/wiki/David_Icke). It was damned funny to listen to. (I really sincerely like these guys but reptile aliens make me laugh).

    Best

    Bill Maxwell

    Comment by Bill Maxwell — 15 February 2006 @ 9:25 PM

  149. The space aliens weren’t referring to people who disagree with me. All the people I respect most disagree with me on at least one major premise. I consciously seek out people who disagree with me–but I also expect intelligent disagreement, not blind acceptance of anything that’s contrarian.

    The space aliens were illustrating the purpose of Ockham’s Razor. Ockham’s Razor is the principle that separates Newton from invisible gremlins.

    The forum thread you presented doesn’t appear to have any evidence besides what I offered, so I’m not sure why you’re talking about my “opinion.”

    Comment by Jason Godesky — 15 February 2006 @ 10:08 PM

  150. Excuse me, but what CAN we eat?

    I can’t digest milk, so I use soy. Now I read that soy is also bad for you.
    So I’m switching to mushrooms. I get them from people who grow them and know how to avoid selling poisonous ones.

    But the MAIN beef (to coin a phrase) that I have with this thread is:

    We live to be 80+ years nowadays. Hunter gatherers’ average age at death was 26.

    I am not sure I understand how a hunter-gatherer’s life is supposed to be such an improvement.

    Comment by organix — 19 February 2006 @ 4:15 PM

  151. Excuse me, but what CAN we eat?

    Pretty much everything–any kind of meat, nearly any kind of vegetable. Drinking the milk of other species is never going to be a good idea, and soy’s on the short list–with legumes, grains, potatoes, peanuts and cashews–of plants we really have no adaptations to deal with.

    If you’re a hunter-gatherer, it’s hard to find anything that you can’t eat. Unfortunately, our civilization can only domesticate those very things we can’t do anything with, so your local grocery store isn’t going to make it easy.

    We live to be 80+ years nowadays. Hunter gatherers’ average age at death was 26.

    Not true in the least. See Thesis #25: Civilization reduces quality of life for a complete discussion of comparative life expectancies. Hunter-gatherers live at least as long, and arguably longer.

    I am not sure I understand how a hunter-gatherer’s life is supposed to be such an improvement.

    Live longer, with a stronger community, more free time, less work, less stress, better health, and a higher standard of living by every criterion imaginable. That’s how.

    Comment by Jason Godesky — 19 February 2006 @ 5:29 PM

  152. “Eat unprocessed grain, and you will die, regardless of your upbringing.” In what quantities? I grew up in the wheat belt and as a child ate raw wheat berries by the handful. I still eat oatmeal raw - too slimy when cooked. [Does squashing the grain flat process it enough to make it suddenly non-poisonous?] I don’t get noticeably sick from it, much less die.
    You claim not to have gotten sick from having eaten raw acorns, but there you were looking at the short-term effects, the same as with my eating raw wheat. In arguing against grains and milk, you seem to be focusing on the long-term effects. Sure, it’s not wise to eat those foods in quantity.
    I would argue that it’s not wise for any human to rely solely on a few foods OF ANY TYPE for health and well-being. The Inuit traditionally survived largely on the meat of large sea mammals. However, rickets and short stature have been endemic in their population. Does that mean that humans are not adapted to eating the meat of sea mammals? NO. It means that humans are not adapted to eating a limited variety in our diets.
    While I have no doubt that eating a paleo diet–removing access to grains, dairy and legumes–produces healthier individuals… is it not possible that that is the case because it forces us to consume a greater variety of foods, since we don’t then simply slide into taking the easy way out and consuming what is so abundantly available that we eat it to the extent that we starve our tissues of other needed nutrients?
    I am certainly interested in trying a paleo diet, but I will not be absolutely ruling out any edible for my consumption. My personal rule of thumb has been and remains: Eat A Wide Variety Of Minimally Processed Foods. [re: the sideline about processing - note I didn’t say necessarily unprocessed]
    I do appreciate, though, having it pointed out how completely dependent the civilized diet is on grains and legumes. I will continue to consume dairy products, but have never been a big milk drinker - I’m tolerant, but have sisters and a kid who are not.
    Oh, and as for the contention that eating grain requires huge stone mills to grind it in sufficient quantities to consume… what about the small hand grinding stones of the Southwestern US tribes? While not something I’d want to lug around on the off chance that’s what I’d find to eat, it shows that huge mills are unnecessary and the processes Janene outlined are reasonable.
    On white potatoes. They taste good raw. I used to eat them that way all the time as a kid. It’s only when they’re cooked that they have to be doctored up to taste decent.
    Noticing I’m referring to what I did as a kid again… my diet was much less civilized then than now… thus that frame of reference.
    Oh, and on the subject of “we can eat pretty much anything except…” what about wood and grass? People cannot even consider eating those things, though they are staples for some other species. The ones named as inedible are not inedible; they are marginally edible.

    Comment by ChandraShakti — 24 February 2006 @ 5:01 PM

  153. PubMed citations: 1, 2, 3

    Comment by Jason Godesky — 28 February 2006 @ 5:32 PM

  154. Hi Jason,
    Have to get on this Acorn thread.

    I agree some Acorn, generally from the White Oak family, has very little tannin taste–sometimes; each individual tree is unique. And yes, you can eat a few of them raw without removing the tannin. The Red Oak family’s Acorn is very high in tannins, and yes, you can tan hides with tannin–I have done it using Black Oak bark and the liquid from the Acorn. (Trade secret, or it was.) The process of removing the tannin from the Acorn is known as leaching. This leached liquid is very medicinal for cuts and scrapes, and for rinsing your mouth out with; it is good for your gums as an astringent. Yes, those folks brushed their teeth using different forms of tannin… the tannin makes your mouth pucker. Tannin is very constipating!!! If you ingest too much, it can be very hard on the kidneys and liver. Tannin is taken for at most three days to treat kidney and liver related ailments… but no longer, or it will make you sick!

    The earliest penicillin-type medicine came from fermented Acorn mash, long before that fellow Pasteur in France. The Yosemite Indians used it. Hidden in the Oak are some of the most powerful antibiotics known. But I believe not too many folks know about this.

    Comment by Fernman — 11 March 2006 @ 9:37 PM

  155. I dislike this notion that once a species is “adapted” to feed on a particular type of food, then it is all sunshine and roses and holding hands. All species are in a constant state of struggle with the other species that they rely on for food. All prey (usually plants) have some form of chemical defense against being eaten, and all predators have some mechanisms to counteract the defenses. But no system of neutralisation is perfect, and the plants’ defenses are always changing. Domestication of plants short circuited the latter mechanism by allowing us to select non-toxic strains, such as sweet almonds that lack cyanide.

    Even monarch butterfly caterpillars suffer to some degree from the toxic alkaloids they are adapted to eat. Adaptation only finds adequate solutions. Once a creature has changed enough to survive and compete there is no further mechanism to improve. In short, there are no perfect foods. Adapting to eat one wild food source perfectly means extreme specialisation, and the associated long-term evolutionary risks. Koalas are adapted to eat toxic gum leaves but still suffer gradual brain damage from the diet; most species of gums are toxic enough to be inedible to them, and they are set up to die out when their food source dies out. Generalists are the survivors in the long run.

    Human digestion, and mastication for that matter, are unusually weak compared to most other species. It’s our ability to change the way we handle foodstuffs before we stuff them in our face that has allowed us to be so adaptable and resilient. Looking at what the Australian aborigines had to do to survive is an example in the extreme… most foodstuffs had to be leached in a river, then cooked to a crisp first.

    I believe a lot of the problems with modern staples aren’t a reflection of their inherent biochemical compositions or our shared genetic pasts, but rather a consequence of the extreme levels of overproduction, industrial processing and prolonged storage required to make the support of our enormous population possible. Problems with cavities related to flour consumption skyrocketed when industrial flour production started. Real flour has a good balance of essential fats and fat-soluble vitamins, and isn’t ground as finely. Real whole flour goes off in a couple of days as these perishable elements react with the air. I’ve started incorporating live grains into my diet, and previous symptoms that were probably related to low-level B-vitamin deficiencies have since disappeared. Processed grains have synthetic B-vitamins added just to the levels that prevent acute vitamin deficiency diseases. Oat and millet flour, ground and made into batter then cooked lightly into simple pancakes, have had no detrimental effects on me. What’s more, even “organic” grain is extremely cheap today, so it’s an economical way to improve your diet (and get some exercise turning the hand mill).

    It seems to me the reaction to our current dietary culture, which is mostly centred around refined carbohydrates, is in danger of swinging too far in the other direction, with all these throwaway quotes about the wonders of a cup of dandelion leaves. Carbohydrates aren’t evil per se–a diet that is too narrow and extreme in any direction comes at a price. Anything can be toxic in excess… even dandelion leaves and acorns.

    Comment by Shane — 17 March 2006 @ 2:04 AM

  156. All species are in a constant state of struggle with the other species that they rely on for food. All prey (usually plants) have some form of chemical defense against being eaten, and all predators have some mechanisms to counteract the defenses.

    That’s not true. Part of the adaptation is sometimes to neutralize the poisons that plants produce to resist being eaten (like the lectin-blocking enzymes that birds produce to safely eat grains), but it’s just as often a strategy of co-evolution. Plants often benefit from being eaten, as animals can be a great way to spread seeds and propagate. At the same time that animals adapt to make better use of the plants as food, the food adapts to become a better food for the animal. The animal gets food; the plant is propagated. Evolution is not always about “constant struggle.” Just as often, it’s about two species learning to form a symbiosis that benefits them both.

    Domestication of plants short circuited the latter mechanism by allowing us to select non-toxic strains, such as sweet almonds that lack cyanide.

    But did so, as you say, imperfectly. Most of our domesticates are at least mildly toxic, even if we ignore the issue of pesticides and other pollutants.

    Once a creature has changed enough to survive and compete there is no further mechanism to improve.

    That is incorrect. Marginal improvements are really the thing evolution does most. A marginal improvement can increase the birth rate by even a tiny percent, which can result in it completely overtaking the population in short order.
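
    To put a rough number on “short order”: here’s a quick back-of-the-envelope sketch, assuming a simple one-locus haploid selection model with an illustrative 1% fitness advantage (the function name and all the numbers here are my own toy illustration, not figures from any study):

    # Toy haploid selection model: a variant with relative fitness 1+s
    # starts at frequency p and spreads through the population.
    # s=0.01, p=0.001 and 25-year generations are illustrative
    # assumptions, not data from any source.
    def generations_to_sweep(s=0.01, p=0.001, threshold=0.99):
        """Iterate p' = p(1+s) / [p(1+s) + (1-p)] until p exceeds threshold."""
        t = 0
        while p < threshold:
            p = p * (1 + s) / (p * (1 + s) + (1 - p))
            t += 1
        return t

    t = generations_to_sweep()
    print(t)       # roughly 1,150 generations
    print(t * 25)  # under 30,000 years at 25 years per generation

    On those toy numbers, a mere 1% edge takes a variant from one-in-a-thousand to essentially the whole population in under 30,000 years–an eye-blink on evolutionary timescales.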

    I believe a lot of the problems with modern staples aren’t a reflection of their inherent biochemical compositions or our shared genetic pasts, but rather a consequence of the extreme levels of overproduction, industrial processing and prolonged storage required to make the support of our enormous population possible.

    Yet it was only with the introduction of these “problems” that our health began to improve. We live almost as long as foragers now, thanks to those “problems.” Those “problems” greatly improved our diet. Heavily processed flour may destroy all the nutrients, but more importantly, it destroys many of the anti-nutrients, like lectin. Such “affluent malnutrition” is the reason we no longer suffer the kind of mortality seen in the Middle Ages. It’s still malnutrition, but at least we get a lot by volume.

    Problems with cavities related to flour consumption skyrocketed when industrial flour production started.

    No, they did not. Look at archaeological digs. Few early agriculturalists had any teeth left at all: and they typically died in their 30s.

    Real flour has a good balance of essential fats and fat-soluble vitamins, and isn’t ground as finely.

    No, it doesn’t. It’s actually pretty impoverished in terms of nutritional content. Compare flour to, say, a dandelion. Flour is almost pure sugar, with little protein and even fewer vitamins. To get enough vitamins from flour, you would need to eat almost nothing but flour, all day long. And then, of course, you get an extreme overdose of sugar.

    It’s precisely because flour is so lacking in vitamins that earlier agriculturalists were, as a rule, diseased. You see skyrocketing rates of various maladies arising from lack of vitamins among populations that depended on bread as the “staff of life.”

    Carbohydrates aren’t evil per se–a diet that is too narrow and extreme in any direction comes at a price.

    I tend to agree with that. A healthy human diet is going to include carbohydrates–but far, far less than we get today. Furthermore, it should mostly come from fruits. Grains, if ever used at all, should probably be confined to emergencies.

    Comment by Jason Godesky — 17 March 2006 @ 10:43 AM

  157. There are two kinds of relationships between organisms where one eats the other. In the first, the victim doesn’t benefit from it and has every incentive to make life as difficult as possible for the other, hence the usual arms race of defense and counter-defense. In the other situation–for example, where a fruit is sweetened to aid in its dispersal or nectar is produced to encourage pollination–the plants do everything they can to get the best deal possible, which means mostly giving away cheap sugar and minimising the amount of protein and mineral sacrificed. Fruit and nectar are not perfect foods.

    Shane-Domestication of plants short circuited the latter mechanism by allowing us to select non-toxic strains, such as sweet almonds that lack cyanide.

    Jason-But did so, as you say, imperfectly. Most of our domesticates are at least mildly toxic, even if we ignore the issue of pesticides and other pollutants.

    Shane-It seems to me you are arguing in opposite directions here. The previous point was that natural co-evolution leads to mutually beneficial synergies, yet here you argue that agricultural co-evolution of man and crops always leads to residually toxic relationships… clarification, perhaps?

    Shane-Once a creature has changed enough to survive and compete there is no further mechanism to improve.

    Jason-That is incorrect. Marginal improvements are really the thing evolution does most. A marginal improvement can increase the birth rate even a tiny percent, which can result in it completely overtaking the population in short order.

    Shane-The real problem in nature is that organisms are always under conflicting selective pressures and are limited by resource availability. Every adaptation comes at the cost of relinquishing other adaptations. Overspecialisation is death also… being a perfect leaf eater closes the evolutionary door to going back to other modes of living. The coupling between genotype and phenotype isn’t precise either–for example, the persistence of recessive lethal genes. Evolution is far from efficient, but that turns out to be a good thing in the longer term, as it makes organisms more adaptable to changing demands. And most of what happens in current times is classical mutagenesis, rather than the utilisation of adaptable gene regulation mechanisms that allow a much more rapid approach. Changes in human height in the last ten thousand years are an expression of such mechanisms, for example.

    Shane-I believe a lot of the problems with modern staples aren’t a reflection of their inherent biochemical compositions or our shared genetic pasts, but rather a consequence of the extreme levels of overproduction, industrial processing and prolonged storage required to make the support of our enormous population possible.

    Jason-Yet it was only with the introduction of these “problems” that our health began to improve. We live almost as long as foragers now, thanks to those “problems.” Those “problems” greatly improved our diet. Heavily processed flour may destroy all the nutrients, but more importantly, it destroys many of the anti-nutrients, like lectin. Such “affluent malnutrition” is the reason we no longer suffer the kind of mortality seen in the Middle Ages. It’s still malnutrition, but at least we get a lot by volume.

    Shane- Causality is still lacking here. Many other things (predominantly how infectious diseases were treated) changed at the same time as the introduction of bleached industrial flour. A good test could be the Amish… they consume plenty of grains, but mill it themselves and consume it whole. A 1985 study found they had a significantly lower level of dental problems than the general population (http://www.ncbi.nlm.nih.gov/entrez/query.fcgi?cmd=Retrieve&db=PubMed&list_uids=3166065&dopt=Abstract).

    Shane-Problems with cavities related to flour consumption skyrocketed when industrial flour production started.

    Jason-No, they did not. Look at archaeological digs. Few early agriculturalists had any teeth left at all: and they typically died in their 30s.

    Shane- Weston Price’s work doesn’t support your proposition that the origins of agriculture are associated with caries and degenerative disease. He points the finger squarely at industrial processing of food.

    Shane-Real flour has a good balance of essential fats and fat-soluble vitamins, and isn’t ground as finely.

    Jason-No, it doesn’t. It’s actually pretty impoverished in terms of nutritional content. Compare flour to, say, a dandelion. Flour is almost pure sugar, with little protein and even fewer vitamins. To get enough vitamins from flour, you would need to eat almost nothing but flour, all day long. And then, of course, you get an extreme overdose of sugar.

    It’s precisely because flour is so lacking in vitamins that earlier agriculturalists were, as a rule, diseased. You see skyrocketing rates of various maladies arising from lack of vitamins among populations that depended on bread as the “staff of life.”

    Shane-A whole wheat grain has all the necessary vitamins and minerals to initiate the growth of a wheat plant. Processed flour has all the living material stripped off; then the remaining starchy endosperm is bleached with chlorine to remove the last traces of anything other than carbohydrate and protein. People do eat flour and other grains most of the day, so the difference between whole flour, with a bit of extra nutrition, and processed flour, with everything stripped out, is significant.

    You have to account for other confounding factors in early agricultural society also. A higher workload would have changed people’s statures (I’ve seen pictures of twins where only one did gymnastics, and the change in stature is dramatic). A more reliable food source would also lead to earlier sexual maturity and reproduction compared to hunter-gatherers, which is another mechanism to decrease body size. Malnutrition is only one mechanism by which people get shorter. Being shorter is an advantage in doing agricultural field work, so it would be hard to be confident there weren’t general selective pressures against being tall.

    Shane-Carbohydrates aren’t evil per se–a diet that is too narrow and extreme in any direction comes at a price.

    Jason-I tend to agree with that. A healthy human diet is going to include carbohydrates–but far, far less than we get today. Furthermore, it should mostly come from fruits. Grains, if ever used at all, should probably be confined to emergencies.

    Shane- I tend to agree with you also, and I think the paleolithic diet approach has some merits. But people are so incredibly variable in their responses to diet that epidemiological studies tend to miss all the personal details. I would more generally encourage people to improve the quality of their diets (including trying genuinely unprocessed fresh grains) rather than broadly asserting that all grains are “poisonous”. At least some adaptation to the neolithic agriculturalist diet has taken place in the last ten thousand years. Current research backs up this claim.
    (http://yubanet.com/artman/publish/article_32502.shtml)

    Comment by Shane — 20 March 2006 @ 7:09 PM

  158. “There are two kinds of relationships between organisms where one eats the other. In the first, the victim doesn’t benefit from it and has every incentive to make life as difficult as possible for the other, hence the usual arms race of defense and counter-defense.”

    In a hunter-gatherer way of looking at the world, there is virtually no relationship in the natural world where both parties don’t benefit. Prey are not considered to be “victims” of predators. In the words of an Inuit elder, “The Wolf makes the Caribou strong”. In fact, a native hunter-gatherer would likely not see Wolves as being essentially separate from Caribou — they are more like light is to dark or yin is to yang than like two distinct entities — the wolf and the caribou are one, just as the people and the caribou (that feed them as well) are one. In other words, the “arms races” of nature (unlike the arms races of civilization) actually benefit both parties.

    A good example of this that has recently been discovered by ecologists is the traditional relationship between Wolves, Buffalo, and Prairie Grasses on the North American Plains. The Wolves cause the Buffalo to move in such a way that they plow the soil to make it more suitable for the Grass. Without the Wolf, the Buffalo do not move this way; the soil becomes compacted, and the Grass does not flourish to the same degree. In other words, the Wolf benefits the Grass, which benefits the Buffalo, which benefits the Wolf.

    In most native legends, predator and prey worked out their relationship in a mutually agreeable and advantageous way long, long ago. This mirrors our culture’s creation-legend (evolution) where predator and prey co-evolved together over countless eons to become who they are, and embody the relationship they have today.

    Comment by RedWolfReturns — 20 March 2006 @ 11:33 PM

  159. Haven’t modern humans and their domesticates been going through this process of negotiating mutual benefit for the last ten thousand years? Our crops and animals have changed quite dramatically from their wild ancestors, and have every incentive to not be detrimental to our health. We have changed too–evolution is a very rapid process under the right selective pressures. I simply don’t buy this assertion that nothing has changed in the human genome, or more importantly our gene regulation, in the last ~40 000 generations. Mutualism is important in nature, but the bottom line is that the parties engage in it for their own long-term benefit.

    Comment by Shane — 21 March 2006 @ 1:26 AM

  160. There are two kinds of relationships between organisms where one eats the other.

    Far more than two, and your scenario is grossly oversimplified. No, fruit alone is not a perfect food for humans, because humans aren’t frugivores. We aren’t even herbivores. We are omnivores. Prosimians were frugivorous, and so, fruit still plays a significant part in a healthy human diet, but because our more recent adaptations have mainly played to meat, fruit has taken a back seat to meat-eating as a source of nutrition. As omnivores, a healthy human diet relies on a proper balance of food sources; mostly animal, with some significant input from wild edibles, fruits and nuts.

    It seems to me you are arguing in opposite directions here. The previous point was that natural co-evolution leads to mutually beneficial synergies, yet here you argue that agricultural co-evolution of man and crops always leads to residually toxic relationships…clarification perhaps?

    Certainly; you need to understand the timescale for such adaptations to take place. Animals rarely turn their entire diet on its head. You don’t see lions trying to live solely off of apples, for instance. To my knowledge, this has only happened once: with our own Agricultural Revolution, when a bunch of omnivores decided to suddenly adopt a completely different strategy, and become granivores. There was a granivore among hominids, once upon a time: the Paranthropus genus, a.k.a., “robust australopithecines.” But they died out–granivorous hominids didn’t work out–and they are not among our ancestors.

    So, usually, we’re talking about a small change in diet. Nut-eating squirrels start eating more acorns than other nuts, for instance. Not a huge change, and not something that’s going to have any kind of significant health effects for the squirrel. It’s imperfect, though, so over a few hundred thousand years, the squirrels begin to develop adaptations to handle acorns better. At the same time, the acorns adapt to become better food for squirrels. After a few hundred thousand years, the two have achieved a level of symbiosis.

    When we look at humans, we see two immediate, glaring differences. First, the change was immense, from omnivorous scavengers to granivores. That entails significant health effects for the animal. Second, the timescale is much, much shorter–just 10,000 years. Even a modest change requires a few hundred thousand years; a radical change takes longer still. For something this radical, humans will need to keep plugging away for something like half a million years before we can eat grains safely. Needless to say, that’s not something that’s going to happen in our lifetimes.
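    To put a rough number on that timescale, here is a minimal sketch using the standard textbook single-locus selection model; the starting allele frequency, the selection coefficient, and the 25-year generation time are all illustrative assumptions, not measurements:

        # A minimal sketch (illustrative assumptions): how many generations
        # a weakly beneficial allele needs to spread through a population.
        def generations_to_spread(p0=0.001, s=0.001, threshold=0.99):
            # Iterate the standard selection recursion p' = p(1+s) / (1 + p*s)
            # until the allele is nearly fixed.
            p, gens = p0, 0
            while p < threshold:
                p = p * (1 + s) / (1 + p * s)
                gens += 1
            return gens

        gens = generations_to_spread()
        print(gens)        # roughly 11,500 generations
        print(gens * 25)   # on the order of 300,000 years

    Under those assumptions, even a single mildly helpful mutation needs on the order of 300,000 years to spread, which is the scale of the “few hundred thousand years” above.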

    Overspecialisation is death also…being a perfect leaf eater closes the evolutionary door to going back to other modes of living.

    That’s why evolution is a process of random experimentation. It doesn’t deal in perfection; it deals in optimization.

    Causality is still lacking here.

    Not in the least. Dental problems are almost always the result of the acids created by the reaction of amylase in the mouth with carbohydrates, like grains. The lectins in grains provide a mechanism for any number of health effects. In the original article, the causality whereby the consumption of grains leads to these various effects was discussed at length.

    Weston Price’s work doesn’t support your proposition that the origins of agriculture are associated with caries and degenerative disease. He points the finger squarely at industrial processing of food.

    He does, but his work is … shall we say, somewhat skewed.

    A whole wheat grain has all the necessary vitamins and minerals to initiate the growth of a wheat plant.

    You will note that humans are not wheat plants.

    You have to account for other confounding factors in early agricultural society also. A higher work load would have changed people’s statures (I’ve seen pictures of twins where only one did gymnastics and the change in stature is dramatic). A more reliable food source would also lead to earlier sexual maturity and reproduction compared to hunter gatherers, which is another mechanism to decrease body size. Malnutrition is only one mechanism by which people get shorter. Being shorter is an advantage in agricultural field work, so it would be hard to be confident there weren’t general selective pressures against being tall.

    Short stature is, in almost all cases, a sign of malnutrition. The stereotype of the Japanese as short arose from the time of Commodore Perry’s contact, when Japan was feudal. Industrialization has raised Japanese statures to levels comparable to anywhere else in the world.

    But these factors are found among the ruling class as well, who were protected from the various other confounding factors you raise. They played their role, but there’s no doubt that reliance on grains is going to lead to severe health consequences for a species adapted to meat, wild edibles, fruit and nuts.

    …rather than broadly asserting that all grains are “poisonous”.

    You’re correct that there’s a great deal of personal variation, but some things fall well outside the range of variation. Nightshade and water hemlock are not good to eat, no matter who you are. No matter who you are, grains are poisonous–they contain neurotoxins and lectins. Period; end of story. How much meat versus how much fruit, etc., that’s a matter of each individual’s own body, but there is no human being for whom grain is healthy.

    At least some adaptation to the neolithic agriculturalist diet has taken place in the last ten thousand years.

    Some, yes. Not nearly enough. We still have no lectin-blocking enzymes, for instance. The adaptations have begun, but there’s far too much to adapt to, and it will be at least a hundred thousand years before any of us can say that it’s healthy for any human, anywhere, to eat grains.

    Haven’t modern humans and their domesticates been going through this process of negotiating mutual benefit for the last ten thousand years?

    They have–but it’s a process that takes a few hundred thousand years for a modest change, and even longer for a radical change like domestication. In other words, it’s begun, but it’s nowhere near complete. Maybe 1% adapted?

    I simply don’t buy this assertion that nothing has changed in the human genome or more importantly our gene regulation in the last ~400 generations.

    No one ever said anything of the sort. I said that the adaptations necessary take far longer than we’ve so far had; that we’ve started to adapt, but that there’s much more to adapt to before it’s a good idea. If lions decided to try living solely off of apples, it would be a long, long time before it became healthy for them. Likewise, after a mere 10,000 years, our Neolithic diet remains too radical a change not to be slowly killing us.

    Comment by Jason Godesky — 21 March 2006 @ 10:57 AM

  161. “Haven’t modern humans and their domesticates been going through this process of negotiating mutual benefit for the last ten thousand years?”

    While 10,000 years is a very small amount of time relative to evolutionary scales, only a small percentage of modern humans have been going through this process for even that long. If you can trace your genetic lineage directly back to that first tiny civilization in the fertile crescent, then you’ve had ten thousand years. Most of us have had far fewer generations to adapt.

    My own northern Germanic heritage would give my lineage about 5000 years to adapt to grains, while one of our neighboring Ojibwe people here in the Northwoods of Wisconsin would have had around 200 years for his lineage to adapt.

    We should keep in mind that less than 400 years ago more than half of the surface of the earth was still populated by genetically “modern” humans who lived in fully viable hunter-gatherer cultures.

    Comment by RedWolfReturns — 21 March 2006 @ 6:30 PM

  162. The 10,000 years of coevolution applies much more to the crops you accuse of being so deleterious–shorter generations and larger populations to select from. Crops are very different to paleolithic wild plants and have experienced enormous anthropic selective pressures. You seem to be subscribing to classical darwinian evolution where tiny changes randomly accumulate in a population, whereas we now know that evolution can and does happen very rapidly, especially in plants where hybridisation of existing species (as happened in wheat) leads to a hyperdiverse hybrid swarm to select from. Under the right circumstances radical evolution can easily occur within this period.

    If the relict protective co-factors present in these plants are so deleterious to human health then why haven’t we bred them out of the strains we consume? Recent breeding has shown that soybeans can be easily developed without the usual complement of trypsin inhibitors, and these strains induce milder growth retardation effects when overfed as raw soybean meal to rats.

    Notice I said raw meal too. A key point is that humans have adapted behaviourally in how they consume these products. Soybeans and lima beans are toxic raw, no questions there, but the cultures that developed them engage in careful processing to make them edible. Many hunter gatherer tribes also consume staples that require careful detoxification…it isn’t purely a trait of agriculturalists. Even then the Chinese have always known that a person shouldn’t eat too much tofu. And now Westerners come along and put soy into every processed food they can find and spin it as a healthy eastern practice…“phytoestrogen” this and “Japanese women don’t get osteoporosis” that. The human body does have a wide variety of endogenous detoxification mechanisms, but eating too much of one food can lead to them being overwhelmed, so in general I agree that the modern industrial diet is imbalanced and dangerous. I don’t agree that agricultural diets are inherently bad though. Modern affluent malnutrition (which I agree exists) has probably weakened our detoxification mechanisms to the point where people are showing sensitivity to foods that they used to tolerate.

    Your much quoted Dandelions have just as much incentive to protect themselves chemically…if I recall correctly the latex from them is a most effective wart remover. Hardly what you would call an ideal food. Wild nuts likewise are usually well armed with tannins or cyanide. Meat eating carries its own risks of parasite transfers. I’m simply not convinced by this garden of eden picture you paint of wild plants and animals just waiting to nourish us. Toxicity of any plant toward humans is usually incidental at any rate…their main concern is the effect of insects upon their reproductive success, and this pressure probably accounts for the persistence of defense compounds in agricultural crops. Sunflower seeds for example contain the most potent trypsin inhibitor known to man (I did my PhD in the lab that discovered it)…yet for some reason raw sunflower seeds don’t seem to be at all toxic to mammals.

    From talking to dentists I got the impression that it is the texture of foods which is far more critical for their contribution to caries than the acidity or sugar content. A glass of orange juice passes through the mouth in seconds and the mouth quickly readjusts its pH. Wheat is especially bad when processed into a fine flour because it sticks to the teeth, creating an environment where acid can build up. Oddly enough I have noticed whole freshly ground flour doesn’t have the same gluey texture as industrial flour…the extra protein and fat and coarser grind makes a difference. Small differences in demineralisation add up over a lifetime to massive differences in dental health. Regular disturbance of the mouth flora with antibiotics probably plays a role too…only one species of bacteria in the mouth among thousands causes cavities. The acid from amylase is small change by comparison. Saliva is alkaline and designed to protect the teeth…problems mostly occur when its access is cut off. General malnutrition leads to a poorer starting point though in terms of enamel quality. And contamination of grain with grit also accelerates tooth wearing, both of which were probably a factor in the dental decline during early agriculturalism.

    No one seems to have tackled the theory that more reliable food sources for agriculturalists led to earlier sexual maturity and earlier reproduction, leading to a decrease in body size (which would have applied to peasants and nobles alike). This fits with the recent reversal of the trend given that reproduction is now greatly delayed in first world societies. Girls typically were pregnant by age 15 in the middle ages. Larger mothers have larger offspring and vice versa, and regular pregnancy tends to divert resources from the growth of the mother. This kind of epigenetic change happens very rapidly in response to the environment. The assertion that being six foot tall is “normal” makes no real sense–all animals have well-worn mechanisms for adjusting their body size to suit the environment.

    Also how reliable are any impressions of paleolithic health issues? I know you can tell a fair bit from the state of skeletons (height, injuries, skeletal defects pointing to disease or some forms of malnutrition). But does this really give us any indication of the incidence and consequences of lower level chronic diseases? Perhaps chronic diseases became noticeable in increasingly stable societies as their consequences weren’t so immediately fatal? I’m not convinced you can get a clear enough picture of population dynamics and the impact of seasonality on hunter-gatherer populations to make any solid arguments about paleolithic lifestyles. Sure you can see healthy older individuals, but it doesn’t really tell you what happened to everyone else along the way.

    You also seem to gloss over the fact that agriculture was developed gradually and that most agricultural societies up until recently still did a reasonable amount of foraging and hunting, especially if their crops failed. It wasn’t a sudden or complete change in lifestyle.

    I hope you guys aren’t getting the impression that I fundamentally disagree with your arguments…I just think you need to be a bit more careful in your assertions. Agriculture did happen, so there must have been a collective benefit. Worker honeybees apparently live gruelling and pointless lives, yet they exist. It’s easier to dismiss agriculture than to truly understand it.

    Comment by Shane — 21 March 2006 @ 8:10 PM

  163. Finally found someone citing real data on stature and longevity from the paleolithic to modern times

    http://www.beyondveg.com/nicholson-w/angel-1984/angel-1984-1a.shtml

    So there indeed were minor decreases in stature and longevity during the transition to agriculture, but perhaps not quite as dramatic as the impression I have gathered from this site. Increases in disease are argued as being equally likely causes for the change as the stress of antinutrients in their grain based diets.

    Comment by Shane — 22 March 2006 @ 1:54 AM

  164. Hey Shane –

    Did you read the article or did you just look at the chart?

    From your link (emphasis mine):

    …farming was hard work, and skeletal evidence shows signs of the heavy effort needed, which–combined with a diet adequate in calories but barely or less than adequate in minerals from the depleting effects of phytate (phytates in grains bind minerals and inhibit absorption)–led to a state of low general health. The considerable decrease in stature at this time (roughly 4-6 inches, or 12-16 cm, shorter than in pre-agricultural times) is believed to have resulted from restricted blood calcium and/or vitamin D, plus insufficient essential amino acid levels, the latter resulting from the large fall in meat consumption at this time (as determined by strontium/calcium ratios in human bone remains).

    So, just as Jason has said, Agriculturalists suffer from affluent malnutrition especially in the modern west… we get plenty of calories, and perhaps even our oldest farming ancestors did as well, but the balance of nutrients is ALL off, both from decreases in meat intake, and, more importantly, from the anti-nutrients in grains that prevent mineral absorption.

    Also, note your link also describes, although not well, that H-G populations have fairly high infant-euthanasia levels… it has been stated before that if we were to look at populations and start at age three, rather than at birth, H-G populations have a MUCH longer median life span than farmers.

    Janene

    Comment by Janene — 22 March 2006 @ 9:08 AM

  165. You seem to be subscribing to classical darwinian evolution where tiny changes randomly accumulate in a population, whereas we now know that evolution can and does happen very rapidly, especially in plants where hybridisation of existing species (as happened in wheat) leads to a hyperdiverse hybrid swarm to select from.

    You don’t seem to understand punctuated equilibrium, though. Punctuated equilibrium rests on our understanding of large amounts of unexpressed genes, and single “trigger” genes that turn those complexes on and off, allowing massive changes in short order. But it does require that the genes already be in play. This is how squirrels can “switch over” to acorns in a single generation. I don’t doubt that punctuated equilibrium happens–but I do doubt that it can suddenly “switch over” to a complex of genes that are not anywhere present in the genome. Wheat has never been food for any kind of ape; there is no ape-friendly genome to switch over to.

    Under the right circumstances radical evolution can easily occur within this period.

    Again, you misunderstand punctuated equilibrium. It does happen very quickly, where “very quickly” means only a few hundred thousand years. On the time scale of evolution and geology, that’s the blink of an eye, an incredibly fast transformation.

    If the relict protective co-factors present in these plants are so deleterious to human health then why haven’t we bred them out of the strains we consume?

    Because we can’t. We’ve tried, and domestication can definitely change plants quite a bit. Wild almonds will kill you immediately–they have about as much cyanide as a cyanide capsule. In domestication, we were able to breed for those mutants that lacked the cyanide. We’ve bred for non-bursting pea pods. All of these were significant mutations from the original species, but still identifiably the same species.

    To breed wheat that has no lectins or neurotoxins, but instead has vitamins and minerals, we would need to breed a kind of wheat that is no longer wheat. These are not things that are superfluous to the plant, like cyanide in almonds; these are things that define the plant itself. To breed out these factors, we would need to turn wheat into not-wheat. Domestication can shape plants into things better suited to our digestion, but it cannot turn carrots into peas, almonds into cabbages, or wheat into t-bone steak. At the end of the process, the domesticated variety is still the same plant, and there’s just no way to make wheat a good food for humans.

    Recent breeding has shown that soybeans can be easily developed without the usual complement of trypsin inhibitors, and these strains induce milder growth retardation effects when overfed as raw soybean meal to rats.

    Yes, which fits into the same rubric as breeding almonds without cyanide–but is quite different from the problem of wheat with no lectins or neurotoxins.

    A key point is that humans have adapted behaviourally in how they consume these products. Soybeans and lima beans are toxic raw, no questions there, but the cultures that developed them engage in careful processing to make them edible.

    Indeed, many foods fit into that category. Of course, no cultural method can ever remove all the harmful elements, so those things do poison you mildly every time you eat them. In moderation, they should be fine. Eat them all the time, and you’ll be wondering why your health is failing, and why your body is falling apart as you stumble into an early grave.

    Cooking wheat kills enough of the toxins in it that it will no longer kill you immediately. Instead, it will only poison you. A diet based on cooked wheat will only kill you slowly, rather than quickly (the way uncooked wheat does). That’s why I think refined wheat is probably better than whole grains–sure, it’s more pure sugar, and more of a shock of insulin, and it has even less nutritional value (which wasn’t much to begin with), but you’ve also broken down more of the toxins, of which there are far more. In general, cooking will help break down potential poisons in food, at the cost of nutrition. Often, this is a worthwhile exchange. Some foods, however, are so toxic, and contain so little nutrition, that it’s never worthwhile. Water hemlock, for instance. Wheat can be made palatable with proper preparation, but it can never be made healthy.

    Many hunter gatherer tribes also consume staples that require careful detoxification…it isn’t purely a trait of agriculturalists.

    I know of hunter-gatherers who do such things, but none that rely on them as staples. As I said, it’s generally safe in moderation, and with some foods more than others. Wheat is a particular plant that can be made edible, but it can never be made healthy. The only non-meat staple I know of among any hunter-gatherer group is the !Kung’s obsession with mongongo nuts–and I understand those are edible raw. Could you name one of these many hunter-gatherers?

    Your much quoted Dandelions have just as much incentive to protect themselves chemically…if I recall correctly the latex from them is a most effective wart remover. Hardly what you would call an ideal food.

    That doesn’t follow in the least. Dandelion sap is supposed to remove warts, but it doesn’t follow that it must be toxic. In fact, it’s not–not in the least. They may have an evolutionary incentive–or not–but if they do, it hasn’t led to anything yet, because there’s nothing in dandelions that is poisonous to humans. Dandelion leaves have more vitamin C than a glass of orange juice; they are a rich source of iron and vitamin A; ground dandelion root is a good coffee substitute that’s actually quite good for your liver; medicinally, it has antibiotic and antiviral uses. It’s one of the most powerful, all-purpose medicines in the herbalist repertoire, earning it the name Taraxacum officinale, and is probably the single most nutritious food on the planet.

    Compare this to wheat, which has all kinds of neurotoxins and lectins, almost no vitamins or minerals to speak of, and little that is digestible for humans besides a cheap sugar rush. Dandelion has no toxins; wheat is full of them. Dandelion is bursting with vitamins and minerals; wheat has almost none.

    Wild nuts likewise are usually well armed with tannins or cyanide.

    Depends on the nut. Hunter-gatherers tend to avoid poisonous plants. Yes, hunter-gatherers ate nuts, and some nuts are poisonous. Some nuts are only mildly so, and sometimes, hunter-gatherers processed them to eat them. Other nuts were too poisonous, and were never eaten. Yet even these nuts are not as toxic as wheat.

    Meat eating carries its own risks of parasite transfers.

    We have defenses against parasite transfers, the same way birds have enzymes that block lectins. You will note that we do not have lectin blockers.

    I’m simply not convinced by this garden of eden picture you paint of wild plants and animals just waiting to nourish us.

    I never said anything of the sort. I said that there’s a diet humans have adapted to. Most of the time, that adaptation is two way–our food adapts to us, and we adapt to our food. Birds develop enzyme blockers to safely eat grains; we develop mechanisms to protect us from parasite transfers through meat. There’s no “Garden of Eden” here, just a basic understanding of adaptation and co-evolution.

    If you take an animal that’s adapted to a diet of meat and wild edibles, and suddenly make him eat grains, it’s going to take a while before that animal has adapted to eating wheat–a lot more than just 10,000 years. If we keep at it, eventually, it will be safe. But not today, and not for quite a bit more time–at least a few dozen more millennia. None of us will live to see the day when grain is healthy.

    Toxicity of any plant toward humans is usually incidental at any rate…their main concern is the effect of insects upon their reproductive success, and this pressure probably accounts for the persistence of defense compounds in agricultural crops.

    That is a gross oversimplification. Plants develop relationships with animals, and not just insects by any stretch of the imagination. Toxins are developed to keep away the animal’s rivals, which may not benefit that plant. Insects are not even close to being the only possible avenue for a plant’s reproductive success. Seeds can be spread in animal dung just as easily as through insect pollination.

    In the case of agricultural crops, specifically wheat, the continuation of its defenses has nothing to do with insect pollination. For wheat, the food is the seed itself. It gains nothing from animals eating its seeds, and wants to keep all animals from eating its seed. You’ll note, most seeds are toxic. Wheat packs its seeds full of poisons to keep animals from eating them. Some animals develop ways of getting around those poisons–like a bird’s lectin blockers–but we haven’t.

    From talking to dentists I got the impression that it is the texture of foods which is far more critical for their contribution to caries than the acidity or sugar content.

    Again, you’re misquoting me. I didn’t say anything about the acidity of sugar content. I said that when sugar comes into contact with amylase, it forms an acid that causes tooth decay.

    No one seems to have tackled the theory that more reliable food sources for agriculturalists led to earlier sexual maturity and earlier reproduction, leading to a decrease in body size (which would have applied to peasants and nobles alike).

    That’s because, as we’ve discussed elsewhere ad nauseam, agriculturalists do not have more reliable food sources. Their food sources are much less reliable. Only agriculturalists suffer famine.

    Girls typically were pregnant by age 15 in the middle ages. Larger mothers have larger offspring and vice versa, and regular pregnancy tends to divert resources from the growth of the mother.

    That had nothing to do with the onset of physical maturity, but with social needs. The death rate was catastrophically high from such an unhealthy lifestyle. Everyone who could had to breed, as early and as often as possible, in order to keep the society from dying off, to balance such a high death rate. The Roman Empire passed laws against bachelors. Celibacy was seen as a crime against the human race itself.

    The assertion that being six foot tall is “normal” makes no real sense–all animals have well-worn mechanisms for adjusting their body size to suit the environment.

    Bullocks. Healthy, adequately fed humans have a given height. If they’re sick or malnourished, they’re shorter.

    Also how reliable are any impressions of paleolithic health issues? I know you can tell a fair bit from the state of skeletons (height, injuries, skeletal defects pointing to disease or some forms of malnutrition). But does this really give us any indication of the incidence and consequences of lower level chronic diseases?

    Yes. Many diseases–especially diseases caused by malnutrition–leave skeletal evidence.

    I’m not convinced you can get a clear enough picture of population dynamics and the impact of seasonality on hunter-gatherer populations to make any solid arguments about paleolithic lifestyles. Sure you can see healthy older individuals, but it doesn’t really tell you what happened to everyone else along the way.

    Then you just need to pick up a book on archaeological methodology. It’s quite sound.

    You also seem to gloss over the fact that agriculture was developed gradually and that most agricultural societies up until recently still did a reasonable amount of foraging and hunting, especially if their crops failed. It wasn’t a sudden or complete change in lifestyle.

    Because it isn’t true. Agriculture was picked up quite suddenly, on an evolutionary timescale–only a few thousand years to go from foragers to farmers. Supplementals don’t matter much when we’re talking about a society’s primary sources of nutrition.

    Agriculture did happen, so there must have been a collective benefit.

    Bullocks. It has to have some benefit, but not necessarily a collective benefit–not if those it benefits have the ability to whip everyone else along. The Agricultural Revolution was led by and for the elites, as we discussed in thesis #10. The only benefits of agriculture are for the elites, and it was the elites who were in a position to bring it about. Agriculture has no collective benefit, but it does benefit the elites. It helps to create the elites, it justifies their existence, and it provides them with a basis of power. The process by which the Agricultural Revolution occurred has been discussed here already at great length.

    Finally found someone citing real data on stature and longevity from the paleolithic to modern times

    Oh, you found thesis #25, where I discussed those statistics in depth, with a long methodological discussion and footnotes and references out the wazoo?

    Wait … no … this is one of the papers that I already debunked in that thesis!

    So there indeed were minor decreases in stature and longevity during the transition to agriculture, but perhaps not quite as dramatic as the impression I have gathered from this site.

    I can see how you would get that from the paper’s statements about “the considerable decrease in stature at this time.” Or this little gem, further down:

    The other pressure limiting stature and probably also fertility in early and developing farming times was deficiency of protein and of iron and zinc from ingestion of too much phytic acid [e.g., from grains] in the diet. In addition, new diseases including epidemics emerged as population increased, indicated by an increase of enamel arrest lines in Middle Bronze Age samples.

    The Neolithic Mortality Crisis is one of the most well-established facts of the archaeological record. When populations pick up agriculture, their statures drop by a good foot or more, and their life expectancy is generally halved–within a generation. Read “Disease and Death at Dickson’s Mounds.”

    Increases in disease are argued as being equally likely causes for the change as the stress of antinutrients in their grain based diets.

    GAH!!!!

    OK, hint: well-nourished people tend to get sick a lot less often than malnourished people.

    Comment by Jason Godesky — 22 March 2006 @ 11:49 AM

  166. Jason, I love you, but just because you recently saw V for Vendetta, that does not give you the right to use the word “bullocks.”

    Comment by Giulianna Lamanna — 22 March 2006 @ 12:19 PM

  167. Bullocks!

    Comment by Jason Godesky — 22 March 2006 @ 12:45 PM

  168. Jason,

    Just FYI (since you’re arguing about staples & toxins), the locals here ate acorns as a staple and they had to leach out the tannins before they ate them.

    Best

    Bill Maxwell

    Comment by Bill Maxwell — 22 March 2006 @ 12:52 PM

  169. As a staple? I know they ate a good amount of acorns, but the presence of acorns with food was the determining factor that divided a “meal” from a “snack”? Acorns were consumed every day, and made up the bulk of their nutritional sources?

    Comment by Jason Godesky — 22 March 2006 @ 12:58 PM

  170. Jason, Giulianna et al,

    Though it warms the cockles of my heart to see Americans adopting Anglicisms, please remember:

    Bullocks = Castrated male cattle

    Bollocks = What they don’t have

    The latter is the one you will be wanting to use in refuting arguments. The former is inappropriate for this type of usage.

    If you could somehow incorporate the word “bollocks” into the book, Jason, you would be doing the English language a great favour. Where I come from it is a sign of scholarly accomplishment and good breeding.

    More or less.

    Clive

    Comment by Clive — 22 March 2006 @ 1:20 PM

  171. Perhaps folks here have already touched on this, I’m not sure. But it’s good to keep in mind that the concept of “staple foods” is more of an aspect of the agricultural mindset than of hunter-gatherer consciousness. Your average agriculturalist diet includes one or two staples and then maybe a few dozen other commonly eaten plant and animal foods. Your average hunter-gatherer would not likely have any “staple foods” and would have hundreds (usually between 100 and 300) different species of plant foods to draw upon seasonally throughout the year. Add to this nearly every single mammal, bird and reptile in their bioregion for meat and a whole bunch of insects and mushrooms as well.

    This understanding of food source diversity must be included in any discussion on agricultural vs. hunter-gatherer nutrition or the issue of comparing the reliability of agricultural food sources to hunter-gatherer ones.

    If you want to pinpoint the reason people adopted agriculture (and there certainly was a good reason) you have to look elsewhere than nutrition or food-source reliability.

    Comment by RedWolfReturns — 22 March 2006 @ 1:44 PM

  172. None of your arguments on evolution exclude the possible loss of function of the gene pathways in crops that lead to the formation of lectins or phytate. Turning a gene off is the simplest thing for evolution to do. There is no fundamental difference between the genetic pathways leading to cyanide in almonds and the pathways leading to phytate or lectin in wheat. If anything the loss of cyanide in almonds is less likely given their much longer generation time.

    Humans and their crops have already reached an adequate relationship (even though this may lead to some disadvantages to the individual compared to the former hunter-gatherer lifestyle) through changes in the morphology and biochemistry of the crop and in the behaviour of the humans. Human evolution of new phytases isn’t the only possible solution to the problem.

    Evolution of plants through hybridisation to form new species happens much faster than on the hundreds of thousands of years timescale. The Lantana weed growing here through much of eastern Australia is a hybrid of three or more original South American species and arose in the last hundred years. Countless other examples are documented. Even insects have recently been observed to hybridise to form a new species that utilised a different food source than either parent. It’s an interesting example as the fly larvae would have to deal with a completely new set of seed defense toxins, including lectins, during the transition from the relatively unrelated blueberry to honeysuckle hosts.

    http://www.biologynews.net/archives/2005/07/28/invasive_honeysuckle_opens_door_for_new_hybrid_insect_species.html

    I still contend that all food is a “poison” to some degree, but so long as the net benefits outweigh the net costs it is a viable food. There are countless examples in nature of very highly adapted species feeding on toxic plants yet still suffering observable detrimental physiological effects. There is nothing “unnatural” about this kind of lifestyle…the organisms are just occupying a viable niche that comes with a price tag. Humans created the agriculturalist niche when it became viable compared to the hunter/gatherer option (probably due to excess hunting).

    I’m still yet to see any evidence of people dying quickly from eating uncooked wheat. There are plenty of clinical examples of people becoming ill from eating uncooked legumes. Am I missing something here? If the data is out there I would love to see it. ChandraShakti brought this up in light of your claims of the acute toxicity of raw wheat but it hasn’t been directly addressed yet.

    The Chamorros of Guam consume cycad flour as a staple after soaking to remove large quantities of cyanide. There is also a residual unnatural amino acid present that causes an increased rate of neurological degradation later in life. Australian aborigines also consumed a fair amount of cycad flour in season. Most of the many seeds they ate needed days of soaking, pounding and roasting before being made edible. In central Australia Aboriginals typically got 50-80% of their calorific intake from seeds (both grasses and legumes).

    http://www.hort.purdue.edu/newcrop/proceedings1996/v3-228.html

    This brings up an important point. Hunter gatherers typically have very seasonal diets, so escape chronic exposure to any one class of nutritional toxins. Agriculture is characterised by the large scale storage of food, especially grains. Perhaps the real problem isn’t wheat per se but the fact we consume it every day? Ironically this is the main advantage of agriculture. Storable excess meant people didn’t instantly starve when they had a poor season, as you have been claiming. They typically had grain stores from last year to tide them over in bad seasons.

    Looking up Mongongo nuts it appears they are roasted before eating. They come from the Euphorbiaceae, a family well known for its toxic irritating latex and one of the most potent allergenic lectins, hevein.

    Dandelion latex kills skin cells on contact, but it doesn’t follow that it is toxic? This isn’t an antiviral effect…I have used it myself and a big chunk of previously healthy skin comes out around the wart. Here is a nutritional breakdown of Dandelion leaf

    http://www.nutritiondata.com/facts-B00001-01c20dN.html

    Sure a cup is healthy but contains basically no energy (protein, fat or carbs), and at much more than about six cups a day you would start risking hypervitaminosis A.

    Contrast this with whole wheat

    http://www.nutritiondata.com/facts-B00001-01c21UZ.html

    And you see one cup gives you half a day’s energy requirement and some protein. Surely you could eat wheat one day to get your energy requirements, then eat dandelions the next day to get your nutrients (highly oversimplified of course). Or does eating wheat actually suck nutrients out of your body? I got the impression phytates are already bound to calcium and iron in the wheat.

    Any source of concentrated energy in nature is going to be protected to some degree. Being selective about what we eat and how we preprocess it can make a food source worthwhile on balance. Might it make more sense to educate people about the drawbacks of grains and legumes in order to encourage them to simultaneously diversify and reduce intake and ensure they are properly processed than to retreat to the stone age?

    Insects and invertebrates are the dominant force in terrestrial habitats. The vast majority of animal biomass is insect, and as such most of the adaptation of any plant is geared much more toward insect pressures than vertebrate or human pressures. Every vertebrate could disappear today and the earth’s ecosystems would continue on without significant change. As such the health effects of plant defense molecules on humans are more or less incidental. I was implying here that the persistence of defense molecules in our crops is probably due to the need to protect themselves from insects being more critical than the selective pressure to nourish humans. In other words I was agreeing with your side of the argument that there may be practical reasons why crops can’t become completely defense molecule free.

    Jason-Some animals develop ways of getting around those poisons–like a bird’s lectin blockers–but we haven’t.

    Actually we have. We variously grind, soak, germinate, roast and bake grains to reduce the amount of defense compounds. It may or may not be adequate, but a bird’s digestive enzymes face comparable practical limitations.

    Shane-From talking to dentists I got the impression that it is the texture of foods which is far more critical for their contribution to caries than the acidity or sugar content.

    Jason-Again, you’re misquoting me. I didn’t say anything about the acidity of sugar content. I said that when sugar comes into contact with amylase, it forms an acid that causes tooth decay.

    Shane- Actually you are misquoting me here. I said acidity OR sugar content. Otherwise your second and third sentences clearly contradict each other. Caries are well established to be caused by specific strains of bacteria that break down sugars to produce a localised acid reaction. I have gotten the impression from dentists that gluey foods like wheat flour greatly accelerate this process. You didn’t directly address this point as it relates to modern industrial flour. General erosion of enamel across the whole mouth does occur from the excess consumption of highly acidic, generally processed, foodstuffs. But the direct reaction of human amylase with starches is not a significant cause of caries or general enamel wear. Minor clarification.

    Jason-That’s because, as we’ve discussed elsewhere ad nauseam, agriculturalists do not have more reliable food sources. Their food sources are much less reliable. Only agriculturalists suffer famine.

    Shane-Storage of food is a simple and reliable mechanism to buffer the effect of poor seasons, and is easier to achieve in a fixed agricultural society with grain based staples. The seasonality of hunter-gatherer diets is usually linked to regular cycles of varying food availability and quality.

    http://jap.physiology.org/cgi/content/short/96/1/3

    In fact the absence of regular periods of fasting and inactivity may be another contributing cause to chronic diseases in societies with a continuous food supply. Hunter-gatherers surely do suffer periods of inadequate food, but their smaller populations and lack of written records make them less obvious. Lean seasons are associated with reduced work load and reduced energetic requirements, unlike agriculturalists that still need to do some work. Agriculturalists also use the same famine foods (e.g. stone pine seeds in the Mediterranean, or tree bark and weeds recently in North Korea) in lean times that hunter gatherers always did. They don’t just sit around pawing at the dust waiting to die, but their greater numbers make these mechanisms less effective.
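    Shane’s storage-as-buffer point can be made concrete with a toy simulation; every number in it (mean yield, yield variance, the one-year storage cap) is an invented illustration, not data from any source:

        # A toy illustration (all numbers invented): yearly harvests drawn
        # at random around the yearly requirement, with and without the
        # ability to carry up to one year's surplus forward as a grain store.
        import random

        random.seed(1)
        YEARS, NEED = 10_000, 100            # arbitrary units

        famine_no_store = famine_store = store = 0
        for _ in range(YEARS):
            harvest = random.gauss(110, 30)  # noisy yields, modest mean surplus
            if harvest < NEED:
                famine_no_store += 1         # shortfall with no store to draw on
            available = harvest + store
            if available < NEED:
                famine_store += 1            # shortfall even after the store
            store = min(max(available - NEED, 0), NEED)

        print(famine_no_store / YEARS)       # roughly a third of years
        print(famine_store / YEARS)          # far fewer

    The point is only qualitative: carrying even one year’s surplus sharply cuts the frequency of shortfall years, so the disagreement really hinges on whether grain stores were available to the people actually doing the starving.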

    Shane-Girls typically were pregnant by age 15 in the middle ages. Larger mothers have larger offspring and vice versa, and regular pregnancy tends to divert resources from the growth of the mother.

    Jason-That had nothing to do with the onset of physical maturity, but with social needs. The death rate was catastrophically high from such an unhealthy lifestyle. Everyone who could had to breed, as early and as often as possible, in order to keep the society from dying off, to balance such a high death rate. The Roman Empire passed laws against bachelors. Celibacy was seen as a crime against the human race itself.

    Shane-The main point I am trying to make here is that nutritional status greatly influences age of menarche. Both medieval and hunter gatherer societies would have wasted no time in allowing mature females to reproduce. I was pondering if females in an agricultural society on an energetically constant but nutritionally impoverished diet may have experienced earlier sexual maturation than hunter gatherers on an energetically sporadic but high nutrient diet. I thought I remembered seeing reports of indigenous girls having fairly late menarche, but fishing for hard data seems to indicate paleolithic menarche was similar to that induced by modern “affluent malnutrition”. So argument dropped there…though to clarify I was never asserting I was right, merely exploring alternate explanations for changes in height.

    http://www.sciencedaily.com/releases/2005/12/051201022811.htm

    Shane-The assertion that being six foot tall is “normal” makes no real sense–all animals have well-worn mechanisms for adjusting their body size to suit the environment.

    Jason-Bullocks. Healthy, adequately fed humans have a given height. If they’re sick or malnourished, they’re shorter.

    Shane- If only it were so simple. African Pygmies grow like normal humans but have no pubertal growth spurt, which apparently isn’t a result of malnutrition. If Pygmies can evolve a different “normal” height within the span of human history then why can’t any other group? A high fertility/high mortality early agricultural society could produce sufficient selective pressure to change height in the absence of chronic malnutrition.

    Shane-You also seem to gloss over the fact that agriculture was developed gradually and that most agricultural societies up until recently still did a reasonable amount of foraging and hunting, especially if their crops failed. It wasn’t a sudden or complete change in lifestyle.

    Jason-Because it isn’t true. Agriculture was picked up quite suddenly, on an evolutionary timescale–only a few thousand years to go from foragers to farmers. Supplementals don’t matter much when we’re talking about a society’s primary sources of nutrition.

    Shane-The lines between hunter-gatherer and agriculturalist are not clear cut. “Supplementals” do matter when you consider such inputs as wild seafood into the nutrition of a nation (e.g. Japan), especially for trace elements and essential fatty acids (though world fish stocks have recently collapsed, leading to “aquaculture”, perhaps serving as a relevant example of how increasing pressures on a food resource can make cultivation gradually appear economical even as apparent quality decreases).

    Shane-Agriculture did happen, so there must have been a collective benefit.

    Jason-Bullocks. It has to have some benefit, but not necessarily a collective benefit–not if those it benefits have the ability to whip everyone else along. The Agricultural Revolution was led by and for the elites, as we discussed in thesis #10. The only benefits of agriculture are for the elites, and it was the elites who were in a position to bring it about. Agriculture has no collective benefit, but it does benefit the elites. It helps to create the elites, it justifies their existence, and it provides them with a basis of power. The process by which the Agricultural Revolution occurred has been discussed here already at great length.

    Shane-Elites are a product of the centralisation of power, typically through the centralisation of the production or more usually storage of food. Agriculture favours this, but the formation of hierarchical societies is known in hunter gatherers where rich, reliable sources of food were available. The Haida are an example, even to the point of relying on enslaving people from other tribes (nothing egalitarian about that!).

    Elites are as much a captive of their societies as the serfs are. There are two ways of thinking about a beehive. Is it the dictatorship of the queen who forces her helpless workers to do all the arduous stuff, or is the queen a captive egg factory being exploited by the workers to take on the burden of reproduction? The reality is somewhere in between. Elites typically have better diets and fewer physical demands, but the competition for these benefits expresses itself in higher rates of murder and intrigue (ever read much feudal history?). And as you pointed out previously the ruling class of medieval Europe suffered the same range of maladies as their subjects. There isn’t enough genetic or physiological separation of “elites” to allow them to truly exploit their underlings. Decisions which favor one class at the expense of the viability of the whole society are possible only in the short term.

    In contrast the agricultural societies of Papua New Guinea have never led to a centralised hierarchy and enslavement of the masses by elites. At most tribes have a leader who still tends his own crops. Why? Perhaps their root based staples are too hard to store centrally and monopolise. I think with agriculture the devil is in the detail.

    Shane-So there indeed were minor decreases in stature and longevity during the transition to agriculture, but perhaps not quite as dramatic as the impression I have gathered from this site.

    Jason-I can see how you would get that from the paper’s statements about “the considerable decrease in stature at this time.” Or this little gem, further down:

    The other pressure limiting stature and probably also fertility in early and developing farming times was deficiency of protein and of iron and zinc from ingestion of too much phytic acid [e.g., from grains] in the diet. In addition, new diseases including epidemics emerged as population increased, indicated by an increase of enamel arrest lines in Middle Bronze Age samples.

    Shane- Yet heights have returned to our normal levels, despite our diets becoming even more intensively agricultural? Was the problem with the neolithic not agriculture per se but imperfect agriculture? And a few sources cite that in periods of good harvests during the middle ages people had comparable average heights to modern and paleolithic humans (only to fall again during the beginning of the industrial revolution). Is it an unsurprising pattern that a fundamental change in lifestyle (agricultural and industrial revolutions) leads to a short term decrease in height then it returns to previous levels?

    Jason-The Neolithic Mortality Crisis is one of the most well-established facts of the archaeological record. When populations pick up agriculture, their statures drop by a good foot or more, and their life expectancy is generally halved–within a generation. Read “Disease and Death at Dickson’s Mounds.”

    Shane-The heights listed in the link I posted fall only six inches (9%) on average, not a foot (or more). And life expectancy by two years (6%). Hence my impression of exaggeration to support your case. The Journal of Population Research has a good online article on the debate over the neolithic mortality crisis that I find more even-handed in its consideration of the possibilities and in admission of possible weaknesses in the archaeological data available. It presents a more complex interplay of diet, culture and disease.

    http://www.findarticles.com/p/articles/mi_m0PCG/is_2_20/ai_111014981
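    For what it’s worth, the percentages above do check out arithmetically. This quick check assumes baselines of roughly 177 cm of pre-agricultural male stature and about 35 years of adult longevity; both are my approximations of commonly cited figures from Angel (1984), not values taken from this thread:

        # Quick arithmetic check (the baseline figures are approximations).
        baseline_stature_cm = 177
        drop_cm = 6 * 2.54                    # the ~6 inch decrease
        print(drop_cm / baseline_stature_cm)  # ~0.09, i.e. about 9%

        baseline_longevity_years = 35
        drop_years = 2
        print(drop_years / baseline_longevity_years)  # ~0.06, i.e. about 6%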

    Shane-“Increases in disease are argued as being equally likely causes for the change as the stress of antinutrients in their grain based diets.”

    Jason-GAH!!!!

    OK, hint: well-nourished people tend to get sick a lot less often than malnourished people.

    Shane-Yes but population density is just as important a factor in the spread of all diseases as host resistance. Hunter gatherers may have been even more malnourished than agriculturalists at times but their isolation and mobility may have minimised the impact of disease. At any rate improved aspects of nutrition that came with settling into agriculturalism must have at least partially balanced out any declines in nutrition and increases in disease. Otherwise it simply wouldn’t have worked…if neolithic agriculture was the disaster you make it out to be then why did it not only arise but spread extensively? And why was the advent of agriculture independently reproduced several times around the globe? You claim it was for the benefit of elites while the Papua New Guineans have no elites. There must be another reason. If hunter-gatherers were physically superior to agriculturalists then why were they pushed into the areas unsuited for agriculture the world over?
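    The density point has a standard quantitative form. As a sketch of textbook SIR epidemiology (not anything drawn from the archaeological sources cited here): with a transmission rate β per contact pair, a recovery rate γ, and a local population N under density-dependent mixing, R0 = βN/γ, and an outbreak can take off only if R0 > 1, that is, only if N > γ/β. Whether an epidemic is even possible thus depends directly on N: a thirty-person band can sit below the threshold that a three-thousand-person town clears easily, before host resistance enters into it at all.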

    It appears to me that as a whole a society of agriculturalists (with or without elites) is capable of making more efficient use of the total energy resources (including effectively non-renewable ones like topsoil) available within certain fertile habitats than hunter gatherers by eating at a lower trophic level on average. Simple as that. No implications on the quality of life for the individual or long term sustainability of the practice though (Middle Eastern agriculture ruined the Fertile Crescent).

    Another couple of points to bring up. Lectins are sugar binding proteins that interact with the cell surface glycosides on everything from bacteria to humans. They occur in pretty much every living thing, even fruits and meat. They are an important component of the innate immune system of most living things. But the variability of each individual human’s glycosides is critical to the significance of the interactions, and partly explains why people have such varying reactions to their diets. Also you seem to put all grains and legumes in the same contraband category, when it is obvious from epidemiological studies that some (such as wheat or soy) seem a lot more problematic than others (such as rice or lentils).

    I don’t deny that the recent exposure of humans to grass lectins is somehow associated with numerous health complaints. But the complete etiology of these disease states is not yet known, and a reasonable explanation for why a lot of these chronic illnesses and cancer have increased greatly only in the last fifty to one hundred years or so (and not just in the elderly) bears closer analysis. What roles have changing levels of other nutrients had? What about changing incidence of diseases from higher levels of crowding and access to medicines? (e.g. Crohn’s disease seems related to an absence of intestinal parasites.) Or the increased use of pharmaceuticals like aspirin that strips the protective coating from the GI tract allowing lectin into the bloodstream or antibiotics that play havoc with the balance of our bodies’ natural flora? Or radical changes to how our food is processed like the bleaching of flour or pasteurisation of milk? Or exposure to innumerable synthetic chemicals?

    I’m not sure by the tone of your responses so far if you are looking for the truth in these matters or confirmation that you are already right. I’m not here to troll you, just to help myself think through these complex issues and bounce ideas off someone who has obviously done a lot of research in this area already. If you would prefer that I left you to your conclusions then let me know and I will continue my journey elsewhere.

    Thanks

    Shane

    Comment by Shane — 23 March 2006 @ 12:35 AM

  173. “Humans created the agriculturalist niche when it became viable compared to the hunter/gatherer option (probably due to excess hunting).”

    It’s unlikely that any type of food scarcity caused the shift from hunting & gathering to agriculture, since agriculturalists usually fall back on hunting & gathering during times of famine, but it has never been observed that starving h/g peoples start planting crops. Starving people don’t plant seeds, they eat them. And h/g’s rarely starve anyway.

    Also, to my knowledge there has never been a solid case made of a h/g people group overhunting & depleting their prey-base (overhunting is much more likely to happen with agriculturalists, actually).

    Besides, agriculture commonly arises in the most lush and fertile areas available to hunter-gatherers. More likely the birth of agriculture has to do with life becoming too easy for h/g’s rather than too hard. This is also supported by the fact that agriculture did not develop until our planet’s climate experienced a virtually unprecedented period of warmth and stability.

    Humans evolved in an unstable climate characterized by frequent ice-ages. We only gave birth to civilization when life suddenly got too damned easy for us.

    Comment by RedWolfReturns — 23 March 2006 @ 1:35 AM

  174. I’m not sure by the tone of your responses so far if you are looking for the truth in these matters or confirmation that you are already right.

    I actively seek out intelligent people who disagree with me. That disagreement helps me refine my understanding. But it’s important that their points be valid ones. If their objections are simply the result of a total misunderstanding of evolution, or ecology, or something else, then it isn’t helpful. Then, it’s my turn to try to be helpful, by correcting such misconceptions. If they continue, at length, with banal arguments based on complete misconceptions, then it becomes quite aggravating.

    That is the tone you’ve picked up: aggravation. I’m not looking for confirmation that I’m right–far from it. I am looking for the truth in these matters. Your posts, so far, have been antithetical to that end, and having to rebut such long messages so devoid of any worthwhile content is time-consuming and very, very aggravating. I’ll give it another go with this one, but I don’t think I can do another: I have other work to push forward, rather than going in circles here.

    None of your arguments on evolution exclude the possible loss of function of the gene pathways in crops that lead to the formation of lectins or phytate.

    None of your arguments include a reason why they would. A grain without lectins is like a primate without eyes–it ceases to be grain at that point. This isn’t a minor change, like an edible almond. This is a complete rearrangement of what the plant is–and it’s something the plant has no incentive to do.

    Turning a gene off is the simplest thing for evolution to do.

    That is true. But you’re talking about turning off most of a grain’s genes. It’s not just lectins–there’s a whole host of various toxins in grain, because animals eating grain are eating the seed itself. There is no reciprocity for the grain the way there is with fruit.

    There is no fundamental difference between the genetic pathways leading to cyanide in almonds and the pathways leading to phytate or lectin in wheat.

    Yes, there is. The genetic pathways leading to cyanide in almonds are a closely-related complex. For grain’s lectins and phytates, you’re talking about genes that involve a great deal of the wheat genome. They’re tied in with the genes for seed formation itself. To turn them off, you’d also need to not form any seeds at all.

    If anything the loss of cyanide in almonds is less likely given their much longer generation time.

    No, because the genes for cyanide in almonds can be isolated. The genes for toxins in wheat cannot.

    But this is pointless. The simple fact of the matter is, wheat still has lectins, gluten, phytates, and a host of other neurotoxins. All the facts listed in the original article still stand: wheat consumption has a near 100% correlation with cancer; wheat is the most common food allergy in humans. As I wrote in the original article, lectins “can strip off protective mucous tissues, damage the small intestine, form blood clots, make cells react as if stimulated by a random hormone, stimulate cells to secrete random hormones, make cells divide at improper times, cause lymphatic tissues to grow or shrink, enlarge the pancreas, or even induce apoptosis.” In the article, I quoted this from the BMJ:

    Until recently their main use was as histology and blood transfusion reagents, but in the past two decades we have realised that many lectins are (a) toxic, inflammatory, or both; (b) resistant to cooking and digestive enzymes; and (c) present in much of our food. It is thus no surprise that they sometimes cause “food poisoning.” But the really disturbing finding came with the discovery in 1989 that some food lectins get past the gut wall and deposit themselves in distant organs.

    There is no doubt that civilization represents a significant drop in health, or that granivorism is a major cause of that. There is no doubt that the toxins in grain are responsible for “affluent malnutrition” and the “diseases of civilization.” Whatever theories you care to make about how that happens, they must acknowledge those facts, or they are theories utterly divorced from reality. When your theory fails to account for known facts, that is the first sign that your understanding is flawed.

    Humans and their crops have already reached an adequate relationship (even though this may lead to some disadvantages to the individual compared to the former hunter-gatherer lifestyle) through changes in the morphology and biochemistry of the crop and in the behaviour of the humans. Human evolution of new phytases isn’t the only possible solution to the problem.

    That can be true under certain definitions of “adequate.” Obviously it’s “adequate,” since we are here. My thesis was that it makes us sick. A relationship that keeps us alive but constantly sickly can still be called “adequate”: it allows us to live our lives, but it cannot give us healthy ones. The consequence of this incomplete adaptation is a life that is “nasty, brutish and short,” “compared to the former hunter-gatherer lifestyle.”

    Evolution of plants through hybridisation to form new species happens much faster than on the hundreds of thousands of years timescale.

    It does–but hybridisation requires varieties to exist. There is no variety of grain that does not produce lectins or other poisons, so hybridisation can’t change that until such a variety emerges by some other means.

    I still contend that all food is a “poison” to some degree, but so long as the net benefits outweigh the net costs then it is a viable food.

    Firstly, that would eliminate grain, but secondly, that contention flies in the face of known facts: namely, all those foods that have no toxic content whatsoever. “Toxic” is always relative to a species. What is toxic to one species may not be to another. Species co-evolve, and a culinary relationship is often mutually beneficial. Some foods are poisonous, that is true, but you’re positing a gross over-generalization by saying that one particular kind of relationship that two species can have is the only relationship possible. There are many, many such relationships, and it’s actually very few that involve poisons.

    There is nothing “unnatural” about this kind of lifestyle….the organisms are just occupying a viable niche that comes with a price tag.

    I shy away from the term “natural,” because it’s so useless. What I claimed is that it was unhealthy, not that it was unnatural.

    Humans created the agriculturalist niche when it became viable compared to the hunter/gatherer option (probably due to excess hunting).

    That does not follow at all. In fact, agriculture can only be initiated in times of plenty. We’ve discussed the origins of agriculture at length, including a long discussion of why the popular theory you’re parroting here is utterly bankrupt. I suggest looking back to the earlier theses for a complete argument on why this point is a total non sequitur.

    I’m still yet to see any evidence of people dying quickly from eating uncooked wheat. There are plenty of clinical examples of people becoming ill from eating uncooked legumes. Am I missing something here? If the data is out there I would love to see it. ChandraShakti brought this up in light of your claims of the acute toxicity of raw wheat but it hasn’t been directly addressed yet.

    According to Dr. Ben Balzer, uncooked flour is quite toxic and will make someone who eats it quite ill.

    The Chamorros of Guam consume cycad flour as a staple after soaking to remove large quantities of cyanide.

    Cycad is a tree, not wheat. Like the almond, its cyanide can be isolated and processed out by a number of methods. Also, the health of Guamanians has suffered from its use. Not enough to kill them off, but their quality of life has been significantly diminished.

    This brings up an important point. Hunter-gatherers typically have very seasonal diets, so they escape chronic exposure to any one class of nutritional toxins. Agriculture is characterised by the large-scale storage of food, especially grains. Perhaps the real problem isn’t wheat per se but the fact that we consume it every day?

    There are a lot of problems with wheat specifically, and the foods that hunter-gatherers eat tend to be much less toxic, with much better adaptations, but yes–a crucial part of the equation is that hunter-gatherers don’t have staples the way we do. Reliance on any one food to such a degree is going to be detrimental, even if it were as healthy as dandelions. It’s even worse if, on top of that, the plant you pick is something you have no adaptation to, that’s full of poison. But you are quite right. Even when hunter-gatherers do eat mildly toxic food (and they sometimes do), it’s never in the constant quantities you get from wheat staples.

    Ironically this is the main advantage of agriculture. Storable excess meant people didn’t instantly starve when they had a poor season, as you have been claiming. They typically had grain stores from last year to tide them over in bad seasons.

    Wrong. Those stores rarely lasted long enough. Foragers do not starve. We have no evidence–ethnographically or archaeologically–of foragers starving. Yet hardly a year goes by that we’re not faced with a major famine somewhere in the world, with large agricultural populations starving to death. Yes, farmers store food. That hasn’t changed much. The stores are eaten though, and the farmers starve to death, because they are utterly dependent on a few, closely-related crops. It doesn’t take much to wipe out their subsistence base. To starve out foragers would require the extinction of most life on earth.

    And you see one cup gives you half a day’s energy requirement and some protein. Surely you could eat wheat one day to get your energy requirements, then eat dandelions the next day to get your nutrients (highly oversimplified of course). Or does eating wheat actually suck nutrients out of your body? I got the impression phytates are already bound to calcium and iron in the wheat.

    The one thing wheat does do is give you cheap energy, in the form of carbohydrates. I never suggested dandelions as a staple–I said they’re high in vitamins and minerals. Basically, with a field of dandelions, you’ll never need another vitamin supplement. The protein in grain is so negligible as to be hardly worth mentioning. But the human body is not terribly good at processing large amounts of carbohydrates; we’re more adapted to getting our energy from animal fat. This reliance on cheap carbohydrates increases the incidence of malnutrition. We don’t get the protein or micronutrients we need, but we’ve got lots of cheap energy that we’re not terribly good at processing. The body assumes it’s starving, and begins hoarding everything it can.

    Any source of concentrated energy in nature is going to be protected to some degree.

    Wrong. Fruit is a source of concentrated energy, too, but it’s adapted to make itself more appealing as a food source, with better taste and no poisons.

    Might it make more sense to educate people about the drawbacks of grains and legumes in order to encourage them to simultaneously diversify and reduce intake and ensure they are properly processed than to retreat to the stone age?

    That would be better than what they’re doing now, but the health issues are only one of the many ways in which our modern lives are far inferior to the stone age. More importantly, we won’t have a choice in the matter–the stone age is coming back, regardless of what we think of it. But eating grain is never healthy–not even in moderation. In moderation, it’s killing you less than if you eat it as a staple. That’s all you can really say about it.

    Insects and invertebrates are the dominant force in terrestrial habitats. The vast majority of animal biomass is insect, and as such most of the adaptation of any plant is geared much more toward insect pressures than vertebrate or human pressures.

    Another non sequitur. Have you never heard of ecological niches?

    Every vertebrate could disappear today and the earths ecosystems would continue on without significant change.

    Wrong. For one thing, it would probably mean the extinction of white pine trees, which have come to rely on squirrels to a significant extent. Ecology is all about interdependence, and it’s a lot more complicated than the vastly oversimplified model you present.

    As such the health effects of plant defense molecules on humans are more or less incidental.

    From false premises, one can hardly expect a reasonable conclusion.

    Actually we have. We variously grind, soak, germinate, roast and bake grains to reduce the amount of defense compounds. It may or may not be adequate, but a birds digestive enzymes face comparable practical limitations.

    No. Our cultural adaptations don’t even touch lectins, though they destroy most of the other neurotoxins. Bird’s enzymes quite effectively block and neutralize lectins. Humans suffer far more effects from lectins than birds, proving beyond a doubt that birds are far more effective at neutralizing lectins than we are–i.e., our adaptation is incomplete, and we suffer from that with poor health.

    Actually you are misquoting me here. I said acidity OR sugar content. Otherwise your second and third sentences clearly contradict each other. Caries are well established to be caused by specific strains of bacteria that break down sugars to produce a localised acid reaction. I have gotten the impression from dentists that gluey foods like wheat flour greatly accelerate this process. You didn’t directly address this point as it relates to modern industrial flour. General erosion of enamel across the whole mouth does occur from the excess consumption of highly acidic, generally processed, foodstuffs. But the direct reaction of human amylase with starches is not a significant cause of caries or general enamel wear. Minor clarification.

    Hmmm. OK, you were right on that point.

    Storage of food is a simple and reliable mechanism to buffer the effect of poor seasons, and is easier to achieve in a fixed agricultural society with grain based staples.

    It is also inadequate. Or is it your position that famines are mythical?

    The seasonality of hunter-gatherer diets is usually linked to regular cycles of varying food availability and quality.

    That is true. Forager diets fluctuate between “plenty” and “adequate,” but they have never dipped into starvation–the way that farmers’ diets regularly do.

    Hunter-gatherers surely do suffer periods of inadequate food, but their smaller populations and lack of written records make them less obvious.

    To be clear, “inadequate food” here means “less than we used to have.” We’ve never found foragers with signs of malnutrition. Meanwhile, MOST farmers evince signs of chronic malnutrition.

    If only it were so simple. African Pygmies grow like normal humans but have no pubertal growth spurt, which apparently isn’t a result of malnutrition.

    No, it’s not–that’s why I never cited a specific height, such as you did. It must be compared to the mean for their population, which takes into account adaptations to the environment. That’s what studies of nutrition have done, and that’s why Dickson Mounds is such an important site.

    The lines between hunter-gatherer and agriculturalist are not clear cut.

    For the most part, they are; the line is sedentism. In theory, perhaps you could have a larger spectrum, but in reality, that shift happened very quickly.

    The reality is somewhere in between. Elites typically have better diets and fewer physical demands, but the competition for these benefits expresses itself in higher rates of murder and intrigue (ever read much feudal history?).

    I agree. But the fact remains that the Agricultural Revolution was by and for the elites.

    And as you pointed out previously the ruling class of medieval Europe suffered the same range of maladies as their subjects.

    I didn’t quite say that. Rulers lived into their 60s; serfs were lucky to live half that long.

    There isn’t enough genetic or physiological separation of “elites” to allow them to truly exploit their underlings.

    Exploitation doesn’t require any genetic or physiological separation. It just requires an armed force to back up your demands.

    Decisions which favor one class at the expense of the viability of the whole society are possible only in the short term.

    Indeed–and as we’ve seen, agriculture is already on the brink of collapse after a mere 10,000 years, making it one of the shortest-lived societies ever to flash in the pan.

    In contrast the agricultural societies of Papua New Guinea have never led to a centralised hierarchy and enslavement of the masses by elites. At most tribes have a leader who still tends his own crops. Why? Perhaps their root-based staples are too hard to store centrally and monopolise.

    That is why the Big Men have never centralized very effectively, but Big Men are still very much elites–and agriculture in New Guinea is very much because of them.

    Yet heights have returned to our normal levels, despite our diets becoming even more intensively agricultural?

    Only in the past fifty years, and only in some Western countries. Greece and Turkey still haven’t caught up. In fact, it’s only happened in those areas rich enough to eat less wheat, and eat more meat and vegetables.

    Was the problem with the neolithic not agriculture per se but imperfect agriculture?

    No, the problem is with agriculture itself, of any kind. We have only approached our former stature insofar as we have been able to move away from agriculture.

    And a few sources cite that during periods of good harvests during the Middle Ages people had comparable average heights to modern and Paleolithic humans (only to fall again at the beginning of the Industrial Revolution).

    a.k.a., “affluent malnutrition,” just like people today.

    Is it an unsurprising pattern that a fundamental change in lifestyle (the agricultural and industrial revolutions) leads to a short-term decrease in height and then a return to previous levels?

    It took 10,000 years to return to previous levels, and we still suffer from a general sickliness.

    The Journal of Population Research has a good online article on the debate over the Neolithic mortality crisis that I find more even-handed in its consideration of the possibilities and in its admission of possible weaknesses in the archaeological data available.

    I’ve read that. I found it quite unfounded. It is very much the minority opinion on the matter.

    Hunter gatherers may have been even more malnourished than agriculturalists at times but their isolation and mobility may have minimised the impact of disease.

    There’s certainly no indication to back up the idea that foragers have ever been more malnourished than farmers.

    At any rate improved aspects of nutrition with settling into agriculturalism must have at least partially balanced out any declines in nutrition and increases in disease. Otherwise it simply wouldn’t have worked… if Neolithic agriculture was the disaster you make it out to be, then why did it not only arise but spread extensively?

    Did you read the other theses? It spread because it was able to field armies of sickly farmers–and a thousand sickly farmers can easily take down one healthy hunter-gatherer. Civilization arose out of coercion, led by elites gathering power for themselves. It has never benefitted anyone but the elites–and even their lives have ended up being less than they once were. There were no improved aspects of nutrition that came from settling into agriculture. Agriculture was a difficult, dangerous, and unhealthy lifestyle that was forced onto larger populations by a small elite through various forms of coercion.

    And why was the advent of agriculture independently reproduced several times around the globe?

    Because it became possible, and ambition is not geographically confined.

    You claim it was for the benefit of elites while the Papua New Guineans have no elites.

    You sure have a funny definition of “elites,” if it doesn’t include Big Men.

    There must be another reason.

    Only if you fail to understand the nature of the elite.

    If hunter-gatherers were physically superior to agriculturalists then why were they pushed into the areas unsuited for agriculture the world over?

    Because a thousand sickly farmers can easily take down one healthy hunter-gatherer.

    It appears to me that as a whole a society of agriculturalists (with or without elites) is capable of making more efficient use of the total energy resources (including effectively non-renewable ones like topsoil) available within certain fertile habitats than hunter gatherers by eating at a lower trophic level on average. Simple as that.

    Except it isn’t true. Agriculture is a net loss of energy, always. Modern, industrial agriculture uses 10 calories of work input for every calorie of food output. More traditional forms are less extreme, but still negative. Either draft animals or petroleum are needed to make up the difference.
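
    To put rough numbers on that (the 10:1 figure is the one I just cited; the 2:1 ratio for traditional farming is only an illustrative assumption), here is a quick sketch:

    # Back-of-the-envelope energy balance for agriculture.
    # input_ratio = calories of work invested per calorie of food produced;
    # anything above 1.0 is a net loss that draft animals or petroleum
    # must subsidize.
    def net_energy(food_calories, input_ratio):
        """Net calories once the work of production is subtracted."""
        return food_calories - food_calories * input_ratio

    # 10:1 is the industrial figure cited above; 2:1 is assumed for
    # illustration, not a measured value.
    for label, ratio in [("industrial", 10.0), ("traditional (assumed)", 2.0)]:
        print(f"{label}: {net_energy(1000, ratio):+.0f} kcal net per 1000 kcal grown")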

    No implications on the quality of life for the individual or the long-term sustainability of the practice though (Middle Eastern agriculture ruined the Fertile Crescent).

    Bollocks. Shorter life-span, more disease, less health, more starvation, more oppression, more war–all across the board. This was covered in a number of theses.

    Comment by Jason Godesky — 23 March 2006 @ 12:33 PM

  175. “It appears to me that as a whole a society of agriculturalists (with or without elites) is capable of making more efficient use of the total energy resources (including effectively non-renewable ones like topsoil) available within certain fertile habitats than hunter gatherers by eating at a lower trophic level on average. Simple as that.”

    Me: The problem is that the earth is not populated by “energy resources”; it is populated by people (and not just human people; rock-people, plant-people, animal-people, etc.). Old-Way cultures realize this, and we civilized people tend to call this realization “animism,” but really it is just the acknowledgement of a respectful relationship between free and equal beings.

    And actually the total “energy resources” of a particular bioregion are much higher when that bioregion is populated by hunter-gatherers than by agriculturalists (especially industrial agriculturalists). This is because hunter-gatherers only take what they need (leaving the rest of life unmolested), and give back all they take. Agriculturalists, in their struggle for domination, nearly always kill more life (“weeds” and “pests”) than they “grow,” and tend to wreak havoc on the web of life on land that they clear and attempt to control. This is also why they live more precarious lives in terms of starvation. Living in an unhealthy bioregion makes life precarious. Living in a healthy bioregion makes life more secure.

    One could say that a slave owner (or an employer) makes more “efficient” use of the “total energy resources” of a human than those of us who would rather relate to each other in the context of an extended clan-family where people care for each other and relate freely with each other, but we could hardly say this is a true advantage in terms of having secure & healthy relationships.

    Healthy relationships make for secure & healthy people.

    Unhealthy relationships make us sick.

    Doesn’t matter a whole lot whether these relationships are between humans and other humans or between humans and plants or animals.

    Comment by RedWolfReturns — 23 March 2006 @ 1:59 PM

  176. Re: acorns (which seems so funny to return to after the last 2 huge comments).

    I’m sticking with Red Wolf on this one. While I believe it was a ‘dominant’ factor (i.e. eaten a lot), I’m not sure it can be compared with modern agriculture and so I won’t :)

    Staples for farming.

    Diet for everyone else.

    Best

    Bill Maxwell

    Comment by Bill Maxwell — 23 March 2006 @ 2:21 PM

  177. Agreed, these posts are becoming a little too labyrinthine, so I will try to reduce things down to what I think are the most contentious points. The difference between stimulation and aggravation is a fine one. I guess I can’t resist playing devil’s advocate and pointing out what appears to be a shaky assertion here and there.

    There is reciprocal benefit between wheat and humans because we cultivate it. The wheat is more “successful” in a reproductive sense if the creatures protecting it are healthy too. The same factors that converted sour wild crab apples into modern sugary dessert varieties act upon wheat.

    Low phytate corn has been bred recently also.

    http://www.vegrains.org/english/varieties_lowphytate.htm

    The level of phytate can be halved without changing the total nutrient content or the viability of the seed.

    On what do you base your assertion that lectins are essential components of seeds and their development?

    Lectins are both a defense mechanism and a minor source of protein for the growing seed. Lectins are proteins, and are the product of linear metabolic pathways: gene goes to preprotein goes to processed protein (with or without glycosylation) (review in Plant Cell, Chrispeels, 3:1, 1991). Any of the steps along the way can be down-regulated or turned off to reduce or eliminate the final product. Sour wild apples have tannins in them, which are the product of a more complex and resilient branched metabolic pathway with multiple starting and end points, yet we managed to breed dessert apples with very little tannin in them.

    Even if lectins can’t be completely left out of seed development, their target selectivity can be changed. Lectins are highly selective saccharide-binding proteins. There are high levels of diversity in human cell surface saccharides and in crop lectins. Since the relevant targets of crop seed lectins are more likely seed-eating insects than humans, there is plenty of scope for the specificity of crop lectin and human saccharide to evolve in such a way that they no longer interact. Admittedly crops are probably under greater selective pressure to give the biggest yield possible, and the eventual health effects on senior citizens are probably low on the list, hence the persistence of some phytate and lectin effects from modern crops.

    In an ecological situation these seed defense molecules have very specific targets as they have to get through the many defenses of the target consumer. I gave an example with the sunflower trypsin inhibitor…in vitro it is the most potent trypsin inhibitor known but in vivo in humans and vertebrates sunflowers are not observed to cause any trypsin deprivation effects.

    The idea that “a grain without lectin is like a primate without eyes” was interesting given how readily cave dwelling vertebrates (mostly fish and amphibians) rapidly and repeatedly lose functional eyes well within tens of thousands of years in environments where there is no selective pressure to keep them. A blind cave fish is most definitely still a fish.

    “All the facts listed in the original article still stand: wheat consumption has a near 100% correlation with cancer; wheat is the most common food allergy in humans.”

    So wheat correlates with cancer…but that doesn’t come close to showing causation. Many cancers are already known to be caused by persistent viral infections (cervical cancer, liver cancer, leukemia). Perhaps eating wheat correlates with crowding, and crowding correlates with a higher rate of viral infection? Some lectins have been shown to influence cell cycling, but most of the ones we eat are primarily insecticidal or fungicidal.

    “hybridisation requires varieties to exist. There is no variety of grain that does not produce lectins or other poisons, so hybridisation can’t change that until such a variety emerges by some other means.”

    Turning off genes (cyanide in almonds, tannin in apples) doesn’t require a pre-existing variety already lacking the metabolite. There is no wild sweet almond, yet the simple mutation occurs frequently enough to be selected by humans. Trypsin inhibitors have been bred out of legumes, and phytate reduced in corn. There is no arbitrary end to the potential of this process to further improve the suitability of crops for human consumption, but confounding factors like crop-insect interactions probably hold it back to some extent.

    “According to Dr. Ben Balzer, uncooked flour is quite toxic and will make someone who eats it quite ill.”

    The best you could find was a general practitioner’s personal web page that is also advocating the paleolithic diet? This only strengthens my impression that your assertion that raw flour is acutely toxic is unsubstantiated and alarmist.

    Below is an example of a peer-reviewed study on a neurotoxin in African peas. Endless animal studies of soy on rats exist. Clinical cases of lima bean poisoning are around too. Something of this calibre should be out there for raw grain flours. At least a clinical report of some dumb stone-mill-using hippy going to hospital with internal bleeding from eating raw wheat haemagglutinin?

    Teklehaimanot R, Abegaz BM, Wuhib E, et al.
    Pattern of Lathyrus sativus (grass pea) consumption and beta-N-oxalyl-alpha-beta-diaminopropionic acid (beta-ODAP) content of food samples in the lathyrism endemic region of northwest Ethiopia.
    Nutrition Research 13(10): 1113–1126, Oct 1993.

    You don’t correct your previous assertion that no hunter-gatherers rely on grains (grass or legume) to provide a significant portion of their diet. The example of central Australian Aborigines is one, and I would expect that among the diverse lifestyles of other indigenous peoples grain and seed eating is common. This extends the evolutionary exposure of humans to these classes of foods much further back than 10,000 years. If the negative effects are truly allergenic in nature, isn’t the dose dependence less straightforward?

    The idea of agriculturalists suffering because of constant exposure to the same set of food toxins made me realise another point. Most agricultural societies have ritualised food consumption patterns which involve excluding some or all of their food sources for certain periods of time. Fasting is a feature of pretty much all societies except our modern one.

    If hunter-gatherers don’t starve then other factors must influence their mortality. Wouldn’t a reduced investment in acquiring food translate into more time to defend territory with violence? Beyond that, for one tribe within a niche, fertility and mortality must be limited by nutrition, disease, and infanticide. Whether or not this leads to an improved quality of life relative to agricultural options for the individual is probably highly dependent on the exact environment they occupy. I think blanket statements about the quality of life of hunter-gatherers and agriculturalists are too simplistic. Examples of longevity and health in places such as Bama in China provide an interesting counterpoint.

    Wheat is cheap energy. An apple is cheap energy too, except it is mostly even cheaper water. In our current society the energy content of food is of very little value due to its dependence on grossly undervalued petroleum. Energy in food hasn’t always been so poorly regarded. As oil peaks we will have to leave behind the current disdain for calories.

    In some ways the current paleo diet philosophy is just an echo of the low-fat fad of the 1990s. Consumption of total fat correlates with disease. Therefore eliminate/reduce total fat intake (with important details like EFAs and fat-soluble vitamins overlooked). Total lectin consumption correlates with disease. Therefore eliminate/reduce lectins from the diet (while missing that most are neutral and some have health benefits). Both are enormously diverse and ubiquitous classes of compounds. We are oscillating wildly between demonising various categories of food, based on incomplete understanding and at the behest of diet-book hucksters out to make a quick buck out of a one-paragraph concept.

    “But the human body is not terribly good at processing large amounts of carbohydrates.”

    Just looking at the different responses of western humans and indigenous populations to a western diet shows that westerners are at least a lot better at it. The corollary is that putting westerners on a genuine indigenous diet could have equally negative effects. How would you assure people thinking of going on a paleo diet that it is perfectly safe? Are there any actual clinical studies completed yet?

    “Fruit is a source of concentrated energy, too, but it’s adapted to make itself more appealing as a food source, with better taste and no poisons.”

    There are examples of fruits that are poisonous to humans, even those that are quite palatable like chinaberries. Again the devil is in the details.

    “But eating grain is never healthy–not even in moderation. In moderation, it’s killing you less than if you eat it as a staple. That’s all you can really say about it.”

    So if eating x amount of grain gives you a 50% chance of getting arthritis by the age of 60, and that is unacceptable, then is eating x/2 acceptable if you probably won’t get arthritis before you are 120? A simplification, but you get the general point.

    Shane-Insects and invertebrates are the dominant force in terrestrial habitats. The vast majority of animal biomass is insect, and as such most of the adaptation of any plant is geared much more toward insect pressures than vertebrate or human pressures.

    Jason-Another non sequitur. Have you never heard of ecological niches?

    Shane-Just read through lists of plant defense compounds. Most of them are antimicrobial or insecticidal and have no measurable effects on humans (like the sunflower trypsin inhibitor). The ones that have an effect on humans tend to be the weaker, broad spectrum ones that slow down any organism. Can you think of a plant that is poisonous to mammals but freely eaten by many insects?

    Jason-“Wrong. For one thing, it would probably mean the extinction of white pine trees, which have come to rely on squirrels to a significant extent.”

    Shane-Pine trees have been around a lot longer than any vertebrates and wouldn’t likely die out from lack of squirrels. Only bird/bat-pollinated/distributed angiosperms and ectoparasites would suffer from the loss of vertebrates from the world.

    Jason-“No. Our cultural adaptations don’t even touch lectins, though they destroy most of the other neurotoxins.”

    Good review here

    http://www.gak.co.jp/TIGG/41MR1e.html

    Also covers the many beneficial effects of many lectins. They are too large and ubiquitous a group of compounds to be universally bad or universally indestructible.

    “most plant-based food/feed stuffs contain appreciable amounts of lectins, some with striking biological activities on gut function. Although in general lectins are more resistant to heat-denaturation than other plant proteins, legume lectins can be inactivated by moderately prolonged cooking.”

    and

    “Although it is known from animal studies that lectins can damage the gut and this may lead to various nutritional disorders (for refs., see 2), it is not generally appreciated that this only occurs at relatively high lectin concentrations in the diet. However, lectins can also have beneficial effects particularly at low dietary inclusions, and these may find uses in medical/clinical practice”

    and

    “There is, in fact, some evidence that most lectins given orally are immunogenic (28). The recent finding of high titre anti-banana lectin (BanLec-1) IgG4 antibodies in pooled human blood further supported this suggestion (29).”

    Even healthy, innocent, paleo bananas have nasty lectins that induce an immune response?

    Early Jason-Bollocks. Healthy, adequately fed humans have a given height. If they’re sick or malnourished, they’re shorter.

    Recent Jason-No, it’s not–that’s why I never cited a specific height, such as you did. It must be compared to the mean for their population, which takes into account adaptations to the environment. That’s what studies of nutrition have done, and that’s why Dickson Mounds is such an important site.

    So normal human height can and does change to some degree with selective pressure? You would agree that malnutrition isn’t the only mechanism to change average height? If so then you need to find a reliable way of disentangling the effects of changing genetic potential and nutrition, which seems pretty hard to do with just dry skeletal remains.

    The real issue here however is that height is more a result of epigenetic change. The genes can stay the same generation after generation, but changes in nutrition in one generation change levels of gene activation in the next, changing the height to which a fully nourished individual will grow. This isn’t malnutrition per se. That only happens to the individual when the metabolic trajectory laid down during embryogenesis mismatches the conditions found in the environment. This suggests that modern humans could take a few generations to adapt back to a genuinely hunter-gatherer diet, which might express itself through reduction in female fecundity.

    Jason-“Indeed–and as we’ve seen, agriculture is already on the brink of collapse after a mere 10,000 years, making it one of the shortest-lived societies ever to flash in the pan.”

    Shane-Actually the Middle Eastern and New World models of agriculture appear to be unsustainable, probably in response to climate change as much as anything. Papua New Guinean agriculture, and to a lesser degree traditional Asian agriculture, both appear to be sustainable as they place a greater emphasis on nutrient recycling.

    Jason-No, the problem is with agriculture itself, of any kind. We have only approached our former stature insofar as we have been able to move away from agriculture.

    Shane-Then what about the period of stability in the Middle Ages where heights returned to their “fully nourished” maximum without agriculture being propped up by petroleum? There are examples of agricultural societies where people seem exceptionally long-lived and healthy, like Bama in China. If these are genuine exceptions can they give us a more practical route to improving our health than abandoning agriculture?

    Jason-“Agriculture is a net loss of energy, always. Modern, industrial agriculture uses 10 calories of work input for every calorie of food output. More traditional forms are less extreme, but still negative. Either draft animals or petroleum are needed to make up the difference.”

    Shane-Ancient American agriculturalists used neither draft animals nor petroleum. Same for Papua New Guinean agriculture. All manual human labor. If agriculture gave a net loss of energy wouldn’t that lead to shrinking agriculturalist populations before the fossil fuel age?

    Jason-Bollocks. Shorter life-span, more disease, less health, more starvation, more oppression, more war–all across the board. This was covered in a number of theses.

    Shane-Yet somehow all these negatives are outweighed by increasing rates of reproduction. From a cold Darwinian point of view the agriculturalists have made the most of the recent interglacial. I agree there have been trade-offs in terms of individual autonomy and health and longer-term environmental impacts though. If agriculturalism leads to the greatest reproductive success then we are not given the option to choose a hunter-gatherer lifestyle, as anyone else taking the ag option can outcompete us.

    One last ingredient to throw into the pot. Have you read much about the impact of sustained nutrient deprivation on the health and longevity of pretty much all animals? Basically if you reduce calories by around 40% relative to unlimited consumption you increase lifespan by 30-50%, with an apparent slowing of aging. Humans seem to follow the trend. This alone could explain a lot of the differences between paleo and neo diets, i.e. degenerative disease as a symptom of heightened fertility and accelerated aging rather than specific dietary reactions.

    Also, are there any good statistics on the rate of arthritis etc. in people who have been on a strict wheat/oat/rye-free celiac diet their entire life?

    Comment by Shane — 24 March 2006 @ 2:58 AM

  178. There is reciprocal benefit between wheat and humans because we cultivate it. The wheat is more “successful” in a reproductive sense if the creatures protecting it are healthy too. The same factors that converted sour wild crab apples into modern sugary dessert varieties act upon wheat.

    Yes, but that doesn’t really work if the plant, in so doing, doesn’t produce any seeds.

    On what do you base your assertion that lectins are essential components of seeds and their development?

    Primarily the fact that we’ve been trying to do exactly what you’re suggesting for 10,000 years, it all works in theory and has worked in other applications, yet it still hasn’t worked. Obviously, something is “up.”

    …yet we managed to breed dessert apples with very little tannin in them.

    Yes we did. But we still haven’t bred out the neurotoxins in wheat. Why?

    Even if lectins can’t be completely left out of seed development, their target selectivity can be changed.

    Then why haven’t we?

    Admittedly crops are probably under greater selective pressure to give the biggest yield possible, and the eventual health effects on senior citizens are probably low on the list, hence the persistence of some phytate and lectin effects from modern crops.

    I would hardly call such drastic rises in mortality “eventual health effects on senior citizens.” Our wheat-based diet is the cause of nearly all of our diseases. Hell, you don’t even get lice unless you eat wheat.

    In an ecological situation these seed defense molecules have very specific targets as they have to get through the many defenses of the target consumer.

    That’s just not true. They have to defend against many different animals; the best bet is just to get a general dose of fuck-you-up and deal it out liberally to anyone that eats you. If you have a very specific target, great, you’re protected from deer. Every other critter in the forest thinks you’re delicious.

    The idea that “a grain without lectin is like a primate without eyes” was interesting given how readily cave dwelling vertebrates (mostly fish and amphibians) rapidly and repeatedly lose functional eyes well within tens of thousands of years in environments where there is no selective pressure to keep them. A blind cave fish is most definitely still a fish.

    Yes, but a primate without eyes is no longer a primate, since primates are kind of defined by their binocular vision.

    So wheat correlates with cancer…but that doesn’t come close to showing causation.

    No, it’s the lectins that show causation.

    Many cancers are already known to be caused by persistent viral infections (cervical cancer, liver cancer, leukemia). Perhaps eating wheat correlates with crowding, and crowding correlates with a higher rate of viral infection?

    To some degree. More often, viral causes of cancer are exacerbated by the general sickliness of malnutrition which results from a wheat-based diet, as well as the fact that your body’s chronically poisoned. That reduces your immune response.

    MORE often, lectins cause random mutations in cells, vastly increasing the occurrence of cancer cells beyond the ability of your weakened immune system to keep up. Your body is constantly creating cancer cells. In healthy people, the immune system is able to keep up. If the rate of carcinogenesis is increased, though, or if the immune system is weakened, you end up with a tumor. A diet of wheat does both–lectins increase the rate of carcinogenesis, and wheat weakens the immune system. The result is that you’ll never find a forager with cancer. We’re still looking for one.

    You don’t correct your previous assertion that no hunter-gatherers rely on grains (grass or legume) to provide a significant portion of their diet. The example of central Australian Aborigines is one, and I would expect that among the diverse lifestyles of other indigenous peoples grain and seed eating is common.

    I haven’t “corrected” it because the original assertion is correct. Aborigines don’t rely on grasses or legumes as a major food source. More than half their food is meat.

    If hunter-gatherers don’t starve then other factors must influence their mortality.

    That’s a common misconception about natural selection. It rarely gets as far as die-off. That’s only the most extreme measure. More often, natural selection has less to do with die-off than it does with out-breeding. Even a 1% higher birth rate in one population vs. another will overtake the whole population in relatively short order. There were times when foragers had harder times than others. When Richard Lee visited the !Kung and recorded them working 2 hours a day, it was in the midst of the worst drought in living memory, in the Kalahari desert. While their Bantu neighbors starved by the thousands, they complained of how long and hard they had to work–but no one went hungry. All the same, they were working more than usual. That leaves less time for reproduction. The birth rate dips. That’s all that’s needed to curb population size. You don’t need people starving to death–you just need people working a little longer, which means they’re having less sex.
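
    To see how quickly even a 1% edge compounds, here is a toy simulation (the growth rates themselves are invented; only the differential between them matters):

    # Toy model: two equal populations, except B's per-generation growth
    # rate is about 1% higher than A's. Both rates are illustrative
    # assumptions; the point is how the differential compounds.
    a, b = 1000.0, 1000.0
    rate_a, rate_b = 1.02, 1.03

    for generation in range(0, 501, 100):
        print(f"generation {generation:3d}: B holds {b / (a + b):.1%} of the total")
        for _ in range(100):
            a *= rate_a
            b *= rate_b

    After 100 generations B is nearly three-quarters of the total; after 500 it is over 99%. Nobody had to starve for A to vanish.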

    Wouldn’t a reduced investment in acquiring food translate into more time to defend territory with violence?

    Conceivably, but why on earth would a forager be interested in defending territory? No examples exist in the ethnographic record, and there’s no archaeological evidence, either.

    Beyond that, for one tribe within a niche, fertility and mortality must be limited by nutrition, disease, and infanticide.

    Those are only the most extreme limits. Most populations are more limited by time and birth rate.

    Whether or not this leads to an improved quality of life relative to agricultural options for the individual is probably highly dependent on the exact environment they occupy. I think blanket statements about the quality of life of hunter-gatherers and agriculturalists are too simplistic.

    It does vary from region to region, but as we’ve discussed elsewhere, the quality of life for foragers in the worst environments imaginable (the ones they occupy now) is superior to the very best agricultural lives possible (namely, ours). Their life expectancy is only a little lower, but they suffer less disease and live healthier lives, with less work and stress.

    Examples of longevity and health in places such as Bama in China provide an interesting counterpoint.

    How so? That’s almost certainly a genetic factor when it’s such a localized “hot spot.”

    Wheat is cheap energy. An apple is cheap energy too, except it is mostly even cheaper water. In our current society the energy content of food is of very little value due to its dependence on grossly undervalued petroleum. Energy in food hasn’t always been so poorly regarded. As oil peaks we will have to leave behind the current disdain for calories.

    No, what I meant by cheap energy is that it’s easily broken down into energy by the body. Unfortunately, that’s also somewhat like low-grade fuel in a car. It works, but not well. Glycemic spikes, the threat of diabetes, increased fat storage and other consequences of such “cheap energy” greatly mitigate its EROEI. The human body is really better equipped to use animal fat for energy than plant carbohydrates.

    In some ways the current paleo diet philosophy is just an echo of the low fat fad of the 1990s.

    Considering that the particular one I’m following advocates more fat, I’m not sure I follow. The “low fat” fad of the 90s was based on the idea that reductionism was adequate to understand nutrition. Paleo assumes that it isn’t sufficient, so we should just eat the things that worked for two million years.

    Total lectin consumption correlates with disease. Therefore eliminate/reduce lectins from the diet (while missing that most are neutral and some have health benefits).

    That’s not the rationale at all. That’s the attempt to justify the reasoning in reductionist terms, and explain why it happens. The rationale is simply what happened. Humans used to be healthy. Then we started eating grains, and we got sick and fat and dumb. How about we try not eating grains anymore?

    Just looking at the different responses of western humans and indigenous populations to a western diet shows that westerners are at least a lot better at it.

    That is true. Do anything, no matter how stupid, for 10,000 years, and you will develop some adaptations for it, however meager. Unfortunately, the adaptations still are nowhere close to catching up with what a monumentally stupid idea it was in the first place.

    The corollary is that putting westerners on a genuine indigenous diet could have equally negative effects.

    Another non sequitur. You start with a population perfectly adapted to a forager diet. A small number of them start eating grains. After 10,000 years, they’re now, what, 1% granivore-adapted and 99% omnivore-adapted? If these people go back to the forager diet they ate 10,000 years ago, they’re still far more adapted to that than they are to a diet of wheat. The effects of a wheat diet are profoundly negative. It does not follow that the diet we’re best adapted to must be equally negative.

    How would you assure people thinking of going on a paleo diet that it is perfectly safe? Are there any actual clinical studies completed yet?

    Our entire civilization is built on grain, so there’s a certain amount of vested interest in “poisoning the well,” as it were. You can find people arguing that paleo is bad, just like you can find dairy farmers lobbying the FDA to make milk one of the four basic food groups. But you can’t argue with evolution, and most studies confirm that paleo’s a good idea. In most studies, it’s taken as a given. I linked to quite a few PubMed citations in “Going Paleo.”

    There are examples of fruits that are poisonous to humans, even those that are quite palatable like chinaberries. Again the devil is in the details.

    Yes, and there are fruits that aren’t–which proves that things are a lot more complicated than your gross oversimplification that all plants are poisonous.

    So if eating x amount of grain gives you a 50% chance of getting arthritis by the age of 60, and that is unacceptable, then is eating x/2 acceptable if you probably won’t get arthritis before you are 120? A simplification, but you get the general point.

    Yes–in sufficiently small quantities, grains can be non-lethal. I said that. But it’s never good for you. THAT’s the point. ANY time you eat ANY grain, you are compromising your health for a quick bit of cheap energy.
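
    In fact your x vs. x/2 arithmetic falls out of the simplest model you could write down–a constant annual hazard proportional to dose (all numbers illustrative, calibrated to your own example):

    import math

    # Simplest dose-response sketch: annual hazard proportional to grain
    # dose, cumulative risk = 1 - exp(-hazard * age). Calibrated so a full
    # dose x gives a 50% chance of arthritis by age 60, as in the example.
    hazard_at_full_dose = math.log(2) / 60

    def cumulative_risk(dose_fraction, age):
        """Chance of arthritis by a given age; hazard scales with dose."""
        return 1 - math.exp(-hazard_at_full_dose * dose_fraction * age)

    print(f"dose x,   by age 60:  {cumulative_risk(1.0, 60):.0%}")   # 50%
    print(f"dose x/2, by age 60:  {cumulative_risk(0.5, 60):.0%}")   # ~29%
    print(f"dose x/2, by age 120: {cumulative_risk(0.5, 120):.0%}")  # 50%

    Halving the dose doesn’t make the risk disappear; it just pushes the same risk out to an age you’ll probably never reach. That’s exactly what I mean by “killing you less.”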

    Pine trees have been around a lot longer than any vertebrates and wouldn’t likely die out from lack of squirrels.

    They’ve adapted and become more specialized. Once upon a time, they’d have been fine. Take away squirrels now, and pine trees will die. Lots of plants now rely on vertebrates. Ecology isn’t so simple; remove any category of animals–even just a single species–and there will be cascades throughout the whole ecosystem.

    Also covers the many beneficial effects of many lectins. They are too large and ubiquitous a group of compounds to be universally bad or universally indestructible.

    I didn’t say they were universally bad–I said they were random. They have massive effects on the body, comparable to hormones, but they’re foreign, so they neither know nor care what that massive effect is. It’s like firing blindly into a crowd, and then deciding it’s a good idea because you didn’t happen to kill anybody–just a few wounded.

    So normal human height can and does change to some degree with selective pressure? You would agree that malnutrition isn’t the only mechanism to change average height?

    You can see adaptation occurring over a long period of time. That’s not what we’re talking about. We’re talking about one group of people, and their direct descendants, often over a single generation.

    If so then you need to find a reliable way of disentangling the effects of changing genetic potential and nutrition, which seems pretty hard to do with just dry skeletal remains.

    It’s actually quite easy. If it happens gradually over a few thousand years, it’s adaptation. If it happens from one generation to the next, and is accompanied by other signs of malnutrition, it’s malnutrition. Malnutrition leaves all kinds of skeletal evidence–not just height. That’s exactly what we find. Again, look up “Disease and Death at Dickson Mounds,” it’s full of those details.

    The genes can stay the same generation after generation, but changes in nutrition in one generation change levels of gene activation in the next, changing the height to which a fully nourished individual will grow.

    No, wrong. It hasn’t a thing to do with genes. Periods of starvation means that the body lacks the chemical components required to grow. The genes are there, but there’s nothing to do it with.

    Papua New Guinean agriculture, and to a lesser degree traditional Asian agriculture, both appear to be sustainable as they place a greater emphasis on nutrient recycling.

    Bollocks. China’s on the verge of breakdown right now, too. New Guinea technically falls into the realm of horticulture, not agriculture.

    Then what about the period of stability in the Middle Ages where heights returned to their “fully nourished” maximum without agriculture being propped up by petroleum?

    I didn’t say anything about petroleum–you did. In the Middle Ages, a brief good spot gave them the same “affluent malnutrition” we suffer from today. There’s lots of ways to achieve that–petroleum’s just one of them.

    There are examples of agricultural societies where people seem exceptionally long-lived and healthy, like Bama in China. If these are genuine exceptions can they give us a more practical route to improving our health than abandoning agriculture?

    No, because such localized examples evince specialized genetic populations, not a lifestyle that could be exported. My guess is that they’d live even longer as foragers. Hell, there are stories of foragers living into their second century.

    Ancient American agriculturalists used neither draft animals nor petroleum. Same for Papua New Guinean agriculture. All manual human labor. If agriculture gave a net loss of energy wouldn’t that lead to shrinking agriculturalist populations before the fossil fuel age?

    That’s why New Guinea’s cultures count as horticulture–they’re below the point of diminishing returns. In the New World, they used other “cheats.”

    Did you read any of the other theses? Because these points have been discussed at length already.

    Yet somehow all these negatives are outweighed by increasing rates of reproduction. From a cold Darwinian point of view the agriculturalists have made the most of the recent interglacial. I agree there have been trade-offs in terms of individual autonomy and health and longer-term environmental impacts though. If agriculturalism leads to the greatest reproductive success then we are not given the option to choose a hunter-gatherer lifestyle, as anyone else taking the ag option can outcompete us.

    Again, have you read any of the other theses? That was largely true in the past. Agriculture traded long-term sustainability for short-term competitive advantages, allowing it to wipe out all the other alternatives right before its clock was up and it imploded in on itself. Once that implosion–collapse–begins, the rules reverse themselves. Sustainability becomes the key to survival, and the opportunity to live as a forager opens up. Indeed, it becomes the strategy with the greatest likelihood of success.

    Comment by Jason Godesky — 24 March 2006 @ 11:15 AM

  179. I’ve tried to structure and slim down my response. This is a very broad topic and I think there are a lot of significant counterpoints your theses have left out of consideration.

    Nature of lectins

    Your assertion that raw wheat flour is acutely toxic remains unsubstantiated. Here is a better example of a report of the toxicity of raw red beans.

    “…ingestion of high doses of PHA causes acute nausea followed by vomiting and diarrhoea.” Critical Reviews in Plant Sciences, Damme, 1998, 575.

    Grain lectins are 0.2-0.5% of total protein content. They bind N-acetylglucosamine, the major component of chitin in insects and fungi. Their probable role is in defense, but their behavior is complex and they may just be signaling molecules for the plant’s actual defenses. Legume lectins in contrast are 0.1-50% of a seed’s total protein, consistent with their sometimes dramatic effects when improperly processed.

    Resilience of any protein makes it more immunogenic. Nuts are bristling with stable proteins and associated allergies, but they also occur everywhere in nature. I concede a possible role of wheat in arthritis, but other factors may be involved. A healthy gut will prevent the transmission of a reasonable load of most lectins into the bloodstream. Drugs and diseases take a heavier toll on our modern GI tracts than lectins ever have. People on painkillers routinely end up in hospital with internal bleeding.

    Jason-“MORE often, lectins cause random mutations in cells, vastly increasing the occurrence of cancer cells beyond the ability of your weakened immune system to keep up.”

    What is the supposed mechanism by which sugar-binding proteins cause mutations in DNA? Different lectins in vitro have been shown both to stimulate cell division and to suppress it. Our old friend wheat germ agglutinin even works in a whole-animal model:

    “The lectin wheat germ agglutinin (WGA) inhibited the growth of MM46 ascites tumors in vivo as effectively as syngeneic antitumor antiserum.”
    http://www.ncbi.nlm.nih.gov/entrez/query.fcgi?cmd=Retrieve&db=PubMed&list_uids=6688401&dopt=Abstract

    Change in crops

    You asserted before that lectins are somehow an essential part of a seed and cannot be bred out. I just found a paper in which researchers bred lectin- and phaseolin-free beans from pre-existing strains. There was no change in seed viability, just a simple increase in free amino acids. Just like low-phytate corn. These compounds help the plant somehow, and possibly hinder the human somehow, but they most certainly are not essential for plant viability.

    Suppression of phaseolin and lectin in seeds of common bean, Phaseolus vulgaris L.: increased accumulation of 54 kDa polypeptides is not associated with higher seed methionine concentrations
    Mark D. Burow, Paul W. Ludden and Fredrick A. Bliss. Molecular Genetics and Genomics (1993) 431-439.

    Retention of some lectins in most crops therefore shows that any health detriments aren’t large enough to outweigh any positive roles in crop protection from insects/diseases.

    Change in human diets

    Firstly, Paleolithic humans did commonly consume legumes, grains, and other sources of starches. Hunter-gatherers are too diverse to make many generalizations about.

    http://www.hort.purdue.edu/newcrop/proceedings1996/v3-228.html

    “Although Australian plants generally produce small seeds they are produced in large quantities. In arid Australia, seed supply is widely available, somewhat predictable and dependable (Flood 1990). These plant products form the dietary staple in that they represent greater than 50% of the total diet and often would constitute 70% to 80%. Hiatt’s data compiled from several sources, and describing the proportions of hunting, gathering and fishing performed by various indigenous peoples, lists three central Australian linguistic groups, the Dieri, Arrernte, and Walpiri (Hiatt 1978). In all three cases, 70% of the diet consists of gatherable foods and 30% from hunting.”

    This is interesting given that central Australian Aborigines seem to suffer the most effects from a modern industrial diet, despite eating grains and legumes for tens of thousands of years.

    Gathering staple wild grains is also known to occur alongside cultivation of others, illustrating how hunter-gathering gradually converts into sedentary agriculture.

    http://en.wikipedia.org/wiki/Ojibwa
    “Most Ojibwa, except for the Plains bands, lived a sedentary lifestyle, engaging in fishing, hunting, the farming of maize and squash, and the harvesting of Manoomin (wild rice).”

    Chestnuts are another starchy staple of Native Americans, and probably were for Paleolithic Europeans as well. Chestnuts, like most nuts, are also rich in lectins.

    Changes in humans

    Humans do show evidence, since the advent of agriculture, of substantial selective evolution specific to where they live.

    http://news.nationalgeographic.com/news/2006/03/0308_060308_evolution.html
    “Signs of recent evolution by natural selection are widespread across the human genome, experts say. Genome researchers at the University of Chicago have identified more than 700 regions in human DNA where apparently strong selection has occurred, driving the spread of genes linked to a broad range of characteristics. “These are very recent events—within the past ten thousand years,” said Jonathan Pritchard, a geneticist whose laboratory team conducted the study. The results suggest that humans in different regions have continued to adapt in numerous ways to both environmental changes and cultural innovations. Many of the genetic changes Pritchard’s group detected came during or after the emergence of agriculture, beginning about 10,000 years ago, and long after the formation of modern human populations. Some of the genes most strongly affected by selection were those associated with skin color, bone structure, and the metabolism of different foods.”

    Genomic changes underlie even more responsive epigenetic changes which occur within and across generations in response to their entire environment. For a good overview see:

    http://www.ehponline.org/members/2006/114-3/focus.html
    “As one example, the researchers found four times as many differentially expressed genes between a pair of 50-year-old twins compared to 3-year-old twins.”

    Jason-“Yes, but a primate without eyes is no longer a primate, since primates are kind of defined by their binocular vision.”

    I’m sure all our blind brethren would be delighted to know you no longer consider them human. Does this mean that the lectin free bean is no longer a bean? Does that mean it is OK to include in the paleo diet?

    The Price of Success?

    Jason-“Our wheat-based diet is the cause of nearly all of our diseases. Hell, you don’t even get lice unless you eat wheat.”

    OK, where are the statistics on cancer in wheat-avoiding coeliacs?

    http://bmj.bmjjournals.com/cgi/content/full/329/7468/716

    “People with coeliac disease have modest increases in overall risks of malignancy and mortality. Most of this excess risk occurs in the year of follow up after diagnosis. People with coeliac disease also have a noticeably reduced risk of breast cancer.”

    That is pretty inconclusive for such an important carcinogen. Fair enough, their rates are mostly higher in the year after diagnosis, when they may still be suffering from wheat. But shouldn’t their rates drop after that? And not just for breast cancer?

    Lice have been on people since long before the Paleolithic.

    http://www.continuitas.com/arch2004.html
    “One of the two louse lineages has a worldwide distribution and appears to have undergone a population bottleneck ca. 100,000 years ago along with its modern Homo sapiens host.”

    Increased human density leads to increased infectious disease (even lice!). Many chronic diseases, which are also aggravated by diet, are probably primarily infectious in origin. Unlike the lectin theory, the mechanism is well established: viruses insert into genomes, leading to massive mutations, and hijack the cell’s machinery to encourage proliferation. Other viruses and bacteria contain peptide sequences that cross-react with those of endogenous tissues, potentially stimulating autoimmune disease.

    Cancer: cervical (HPV), leukemia (the feline model is well known), breast (HHMMTV http://news.bbc.co.uk/1/hi/health/3159593.stm), prostate, liver (flukes). The cervical case is well proven, and the others show the same epidemiological hallmarks HPV did a few decades ago.

    Type 1 diabetes (coxsackie B)

    Even “obesity”. http://archives.cnn.com/2000/HEALTH/07/28/fat.virus.ap/

    Civilisation also increased viability of more obvious epidemic diseases and of vector borne diseases.

    Agriculture also leads to a more regular food supply. Continuous exposure to the same set of food-derived allergens and antinutrients increases the chance of allergic reactions or overwhelmed detoxification mechanisms. A regular food supply leads to continuous operation of the GI tract and a lack of periods to heal the endothelium and remineralise the teeth.

    Jason-“The result is that you’ll never find a forager with cancer. We’re still looking for one.”

    Firstly, cancer rates weren’t recorded until the 18th century in Europe. Secondly, the opportunities to detect cancer in hunter-gatherer societies are highly limited. Thirdly, assuming the apparent lack of cancer in HG societies is related to diet and not infectious disease, it could either be from grain/legume staples causing cancer, or from grain/legume staples displacing other foods that protect against cancer. You need a way to differentiate the two. Modern and agriculturalist diets are certainly lower in cancer-preventing nutrients, and this has only been exacerbated in the West over the last fifty years by increased processing of food and a decrease in total food intake on taking up a more sedentary lifestyle.

    Extension of observations on the lifestyles of modern-day hunter-gatherers like the !Kung to all Paleolithic peoples is highly dubious. The hunter-gatherers that persisted the longest were generally in the most marginal climates and at the lowest population densities. The confounding effects on the dynamics of infectious diseases and on sociological impacts are enormous.

    Jason-“When Richard Lee visited the !Kung and recorded them for working 2 hours a day, ………That leaves less time for reproduction. The birth rate dips. That’s all that’s needed to curb population size.”

    It takes 22 hours a day to get pregnant? I read somewhere the !Kung females actually had very low fecundity due to persistent herpes infections, but I can’t find the link. This link
    http://courses.washington.edu/anth457/repstrat.htm
    suggests the burden of collecting nuts and carrying young children is what really limits !Kung fecundity. Having children more frequently limits their ability to forage, and hence leads to higher child mortality.

    Jason-“Conceivably, but why on earth would a forager be interested in defending territory? No examples exist in the ethnographic record, and there’s no archaeological evidence, either.”

    European Paleolithic cave art shows tribal warfare. Records and stories of Australian Aboriginal tribes fighting are also common. Foragers must protect the rights to their territory, as it is the source of their sustenance. I recall reading cases of one tribe selling out another with the appearance of Europeans, too.

    http://www.archaeology-safaris.co.uk/gn/violence.htm

    “Within “traditional” Aboriginal mythology there are some very violent accounts of war-like skirmishes between Aboriginal groups.”

    http://unauthorised.org/anthropology/anthro-l/november-1994/0034.html

    Jason-“It does vary from region to region, but as we’ve discussed elsewhere, the quality of life for foragers in the worst environments imaginable (the ones they occupy now) is superior to the very best agricultural lives possible (namely, ours). Their life expectancy is only a little lower, but they suffer less disease and live healthier lives, with less work and stress.”

    Shane-Worst and best environments here only refer to the amount of food available. As I have already suggested, while HGs in hostile environments can have relatively stable, low-disease, low-conflict lifestyles, that doesn’t automatically mean that HGs in rich environments can choose to avoid these problems. I reckon larger, homogeneous, sedentary societies are a plausible mechanism to limit violence and conflict in these richer regions. Relying on a smaller, richer parcel of land for your sustenance gives you more incentive to plant your own crops and tend your own animals.

    Shane-“Examples of longevity and health in places such as Bama in China provide an interesting counterpoint.”

    Jason-“How so? That’s almost certainly a genetic factor when it’s such a localized ‘hot spot.’”

    Shane-So these people have a form of agriculture to which they are perfectly adapted? And other examples in Sardinia and Okinawa suggest it wasn’t a one-off fluke? Does this mean we can live long and healthy agricultural lifestyles if the details are right? Oddly enough, these are all small, geographically isolated societies, so population density and its links to disease and conflict appear to be a major factor.

    Shane-“Wheat is cheap energy. An apple is cheap energy too…”
    Jason-“No, what I meant by cheap energy is that it’s easily broken down into energy by the body…… The human body is really better equipped to using animal fat for energy than plant carbohydrates.”

    Shane-Readily available energy can be very useful, as can readily stored energy. Our body’s ability to handle either is, however, limited. We are omnivores that excel at eating pretty much anything, but I’m sure different people have different preferences of energy-source mix.

    Shane-“In some ways the current paleo diet philosophy is just an echo of the low fat fad of the 1990s.”

    Jason-“Considering that the particular one I’m following advocates more fat, I’m not sure I follow. The “low fat” fad of the 90s was based on the idea that reductionism was adequate to understand nutrition. Paleo assumes that it isn’t sufficient, so we should just eat the things that worked for two million years.”

    Shane- But people have eaten just about everything except modern industrial food for millennia. Paleo is reductionist in the way it demonises broad classes of compounds like lectins without acknowledging the complexities and unknowns still remaining. I guess behaviorally it is much easier to get people to eat none of something than just less of it.

    Jason-“That is true. Do anything, no matter how stupid, for 10,000 years, and you will develop some adaptations for it, however meager. Unfortunately, the adaptations still are nowhere close to catching up with what a monumentally stupid idea it was in the first place.”

    Shane-I would point out Bama, Okinawa, and Sardinia yet again, and the evidence that human metabolism genes have been under strong selective pressure for the last ten thousand years.

    Jason-“The Neolithic Mortality Crisis is one of the most well-established facts of the archaeological record. When populations pick up agriculture, their statures drop by a good foot or more, and their life expectancy is generally halved–within a generation. Read ‘Disease and Death at Dickson’s Mounds.’”

    Shane-From what I can find, you were definitely exaggerating here regarding life expectancy. E.g.:
    http://72.14.203.104/search?q=cache:dnddMuoC1wwJ:anthropology.lbcc.edu/handoutsdocs/mistake.pdf+Disease+and+Death+at+Dickson%27s+Mounds&hl=en&ct=clnk&cd=20&client=safari
    “Life expectancy at birth in the preagricultural community was about twenty-six years,” says Armelagos, “but in the postagricultural community it was nineteen years.”
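
    To put a number on it (my arithmetic, using Armelagos’s figures above): 19 / 26 ≈ 0.73, a decline of roughly 27% in life expectancy at birth. That is a serious drop, but well short of a halving.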

    Dickson’s Mounds is an illustrative example of the possible costs of a rapid change in diet through agriculture introduced from elsewhere. In fact, we see similar results where agriculture is introduced from a different area in a relatively short amount of time, such as in Scandinavia and Ireland.

    Chronic trans-generational malnutrition, especially when combined with the extra disease burdens of sedentary societies, has always seemed like a bit of a puzzle to me, given that it was nonetheless balanced by increasing population size. An animal’s energy balance year on year is an incredibly delicate thing… a few percent too much and you become obese, a smidge too little and you become malnourished enough to stop breeding or outright die.
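
    A quick back-of-the-envelope sketch (Python) of just how delicate that balance is; the 2,000 kcal/day intake and the 3,500 kcal-per-pound-of-fat conversion are rule-of-thumb assumptions of mine, not figures from the archaeology:

        # How fast "a few percent too much" compounds over a year (assumed figures).
        daily_intake = 2000      # kcal/day, assumed typical adult intake
        surplus_fraction = 0.02  # "a few percent too much"
        kcal_per_lb_fat = 3500   # rule-of-thumb energy content of a pound of body fat

        surplus_per_year = daily_intake * surplus_fraction * 365  # 14,600 kcal
        print(surplus_per_year / kcal_per_lb_fat)  # ~4.2 lb of fat gained per year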

    I think the problem actually comes from the storable nature of grains. Since they were essentially the first fungible form of energy humans ever produced, they allow trade, reallocation and taxation. This provides a perfect mechanism whereby the incredibly fine line between malnutrition with sustainable reproduction and malnutrition with decreasing populations can be followed. Doing this would have been incredibly risky for the elites: as downtrodden as the workers were, the elites would not want to push them far enough to die off or revolt. Perhaps the workers made a reasonable decision between the secure certainty of almost starving and the insecurity of an anarchic society. I should point out, too, that the elites weren’t using the grain to buy ivory backscratchers. It was probably spent on the military, to defend the society as a whole from other societies, and on public works. The analogy of an elite as a parasite isn’t all that insulting when you understand the essential role of parasites in ecology. They are the single most important element of an ecosystem for maintaining sustainable population levels, and they live in a state of perpetual dynamic tension with their hosts, experiencing anything but an easy life.

    Shane-“Papua New Guinean agriculture, and to a lesser degree traditional Asian agriculture, both appear to be sustainable as they place a greater emphasis on nutrient recycling.”

    Jason-“Bollocks. China’s on the verge of breakdown right now, too. New Guinea technically falls into the realm of horticulture, not agriculture.”

    Shane-China is facing ecological collapse from modern green revolution agriculture. Traditional agriculture maintained comparable yields year after year for thousands of years, weather permitting. New Guinean cultivation practices are too diverse to put any one label on.

    Jason-“Again, have you read any of the other theses? That was largely true in the past. Agriculture traded long-term sustainability for short-term competitive advantages, allowing it to wipe out all the other alternatives right before its clock was up and it imploded in on itself. Once that implosion–collapse–begins, the rules reverse themselves. Sustainability becomes the key to survival, and the opportunity to live as a forager opens up. Indeed, it becomes the strategy with the greatest likelihood of success.”

    Shane-I agree that a foraging lifestyle will become more viable in some areas as our post-petroleum societies collapse. It was apparently the difference between life and death in North Korea when its petroleum supply was severed. But whether a society will go down the rejuvenated agricultural route like Cuba depends a lot on the details: climate, soil, current infrastructure and skills, rate of petroleum depletion. I can’t see the Papua New Guineans changing the way they do things one iota… for them, industrialization and the collapse of the rest of the world will pass by without any great impact other than a few new diseases to get used to.

    I have been meaning to point out as well that the distinction between the agricultural and HG ways of life is largely arbitrary. All humans modify their environments, especially their food sources, for better or worse. Even the most harmonious HG changes the abundance of other organisms through their eating habits. That’s why buffalo and passenger pigeons became super-abundant after the North American natives were wiped out by disease. The same thing happened with kangaroos and koalas in Australia (the latter were part of a booming fur-hat industry in the 19th century). HG people also engage in low-level horticultural practices… Australian Aborigines routinely replanted the top third of dug yams. Is eating a fruit and depositing the seed “natural” because we don’t need to use our hands to do it?

    The Price of Change

    Shane-“How would you assure people thinking of going on a paleo diet that it is perfectly safe? Are there any actual clinical studies completed yet?”

    Jason-“Our entire civilization is built on grain, so there’s a certain amount of vested interest in “poisoning the well,” as it were. You can find people arguing that paleo is bad, just like you can find dairy farmers lobbying the FDA to make milk one of the four basic food groups. But you can’t argue with evolution, and most studies confirm that paleo’s a good idea. In most studies, it’s taken as a given. I linked to quite a few PubMed citations in ‘Going Paleo.’”

    The references you linked are:

    1- Plant-animal subsistence ratios and macronutrient energy estimations in worldwide hunter-gatherer diets.
    Cordain L, Miller JB, Eaton SB, Mann N, Holt SH, Speth JD
    A current ethnographic survey of hunter-gatherers comparing % protein/carb/fat.

    2- The critical role played by animal source foods in human (Homo) evolution.
    Milton K.
    Comparative anatomy of human and primate guts.

    3- Reducing the serum cholesterol level with a diet high in animal fat.
    Newbold HL.
    A small clinical study where hyper-allergic patients got most of their energy from beef tallow, showing a reduction in serum cholesterol. I can’t access the whole article, so I am unsure whether this really qualifies as paleo.

    4- The paradoxical nature of hunter-gatherer diets: meat-based, yet non-atherogenic.
    Cordain L, Eaton SB, Miller JB, Mann N, Hill K.
    Another ethnographic study on HG diet proportions.

    5- Cardiovascular disease resulting from a diet and lifestyle at odds with our Paleolithic genome: how to become a 21st-century hunter-gatherer.
    O’Keefe JH Jr, Cordain L.
    A review on the paleo diet. Mostly epidemiological and ethnographic evidence cited.

    6- Paleolithic vs. modern diets–selected pathophysiological implications.
    Eaton SB, Eaton SB 3rd.
    Another historical review.

    7- [The carbohydrate theory][Article in German]
    Lutz W.
    Another historical review.

    So indeed the benefits of the paleo diet for modern western humans are “taken as a given” without substantiating evidence. So I take it, then, that no clinical trials of a strict “Paleolithic” diet have been conducted? Not even preliminary ones? How do we really know the long-term effects on a western industrial human of changing to the paleo diet rapidly halfway through their life? Why wouldn’t a step back to an idealised pre-industrial agricultural diet be a safer bet?

    For the record, I have no concerns about the paleo diet in theory, just as I have no concerns about a well-balanced agricultural diet. However, as an advocate, the burden of proof rests with you to demonstrate that it has no long-term ill effects when applied within our current industrial lifestyle.

    My main concern is that the majority of paleo-style, but nonetheless industrial, ingredients available to people may not genuinely reflect their supposed wild constitutions. Eating 60% wild meat may well be fine, but eating 60% industrial meat could have many unexpected consequences.

    Comment by Shane — 27 March 2006 @ 8:55 PM

  180. Speaking of mental illness, pretty much everybody has probably known somebody who was or is a “control freak”. That’s the pop-culture name for Obsessive Compulsive Personality Disorder (a different thing from OC Anxiety Disorder, where people wash their hands fifty times a day and things such as that). I recently had the realization that OCPD is simply the toxic, destructive mindset of civilization writ small into an individual human mind. Read this article and see if you don’t agree:

    http://www.ocdonline.com/articlephillipson6.php

    Comment by Thomas Rondy — 27 August 2006 @ 11:27 AM

  181. Candidiasis is another of the growing diseases of civilization. Sugar is the major cause of it, but I wouldn’t be surprised if wheat gluten contributes mightily as well.

    Comment by Thomas Rondy — 27 August 2006 @ 11:44 AM

  182. Jason wrote:

    “I myself have dined on raw meat and raw tree bark, but never raw grains. Eat unprocessed grain, and you will die, regardless of your upbringing. Drink milk, and if you’re not one of those “lucky” few with a particular mutation, you will get sick, regardless of your upbringing.”

    I’m more with ya than agin ya, Jason. Overall, your articles are pretty spot-on. But this (above) comment of yours is rather over the top. Most people drink milk and eat dairy products without serious harm, or ANY harm to speak of. Sorry, but that’s a fact. And raw grains are quite edible when sprouted or soaked. In fact the problems with lectins and gluten are greatly ameliorated thereby. And indeed it is likely that our paleo forebears ate IMMATURE (green) “grains” as they found them, without harm and with benefit.

    Bottom line: there’s no point setting up grains and dairy as the Satanic Foods of Civilization Which WILL KILL You — Every One Of You!

    The extremeness is amusing and has great shock value, but is ultimately not (pardon the expression) very nourishing.

    Cheers!

    Comment by alan2012 — 5 December 2006 @ 11:45 PM

  183. PS: Post #179, from Shane, is one HELL of a post! It should not be read, it should be studied! Would you not agree, Jason?

    Comment by alan2012 — 5 December 2006 @ 11:52 PM

  184. [quote]It takes 22 hours a day to get pregnant?[/quote]

    The larger context provides your answer:
    [quote]When Richard Lee visited the !Kung and recorded them for working 2 hours a day, it was in the midst of the worst drought in living memory, in the Kalahari desert. While their Bantu neighbors starved by the thousands, they complained of how long and hard they had to work–but no one went hungry. All the same, they were working more than usual.[/quote]

    Jason’s statement about less time for reproductive activities (i.e. sex) is directed towards the !Kung’s Bantu neighbors, not the !Kung themselves.

    Comment by jhereg — 6 December 2006 @ 11:27 AM

  185. Most people drink milk and eat dairy products without serious harm, or ANY harm to speak of.

    That’s incorrect. Most people are “lactose intolerant,” which is a fairly bizarre term, since it makes the common condition sound pathological. What’s unusual is lactose tolerance. What’s more, wheat and dairy are the most common food allergies; they’re also almost completely undiagnosed, because they’re simply accepted as “normal,” the same way schistosomiasis is normalized in Africa (in some African cultures, incidence is so universal that, where the first menstruation is the sign of adulthood for women, for men it’s the first appearance of blood in the urine). In actual fact, almost everyone has severe allergies to these foods, and accepts a state of permanent sickness as the status quo.

    PS: Post #179, from Shane, is one HELL of a post! It should not be read, it should be studied! Would you not agree, Jason?

    No, I wouldn’t. I find his argument entirely unconvincing, but as I mentioned in comment #174, wading through such large volumes of specious argument and sophistry is far too time-consuming and aggravating to continue.

    Comment by Jason Godesky — 6 December 2006 @ 11:54 AM

  186. Jason:
    [quote]Eat unprocessed grain, and you will die[/quote]

    alan2012:
    [quote]And raw grains are quite edible when sprouted or soaked.[/quote]

    I, personally, would consider sprouting ‘processing’. True, it’s the best possible processing, not least because it’s so energy-efficient.

    Jason:
    [quote]Drink milk, and if you’re not one of those “lucky” few with a particular mutation, you will get sick[/quote]

    alan2012:
    [quote]Most people drink milk and eat dairy products without serious harm, or ANY harm to speak of.[/quote]

    I guess that really depends on how you’re defining “most people”. If by “most people” you mean North Americans, alan2012’s statement is probably true (or relatively true; there’s a lot of opportunity here to debate the effects dairy has on osteoporosis, obesity, and the many different flavors of heart disease). But if you mean “most people” in the world, then Jason’s statement is more accurate, since the majority of folk in the world are lactose intolerant.

    Comment by jhereg — 6 December 2006 @ 11:56 AM

  187. You said that protein and fat are never turned into fat, only carbs. That is incorrect. The body can convert all of those things into fat. In fact, dietary fat is stored in fat cells very easily: it only takes about 2-3 calories to store 100 calories of fat as fat, whereas it takes about 20 calories to convert carbs and protein to fat. Carbs, protein, and fat are made of carbon, hydrogen, oxygen, and nitrogen (for protein). Do you really think your body can’t just move molecules around and make something else? Your body can also make glucose from protein and fats. It’s called gluconeogenesis.
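
    To make that storage arithmetic concrete, here is a minimal sketch (Python) treating the 2-3 calorie and 20 calorie conversion costs cited above as assumptions rather than established values:

        # Net energy banked per 100 kcal of surplus, after paying the conversion cost.
        # The cost fractions are the figures cited above, taken as assumptions.
        def net_stored(kcal_surplus, cost_fraction):
            return kcal_surplus * (1.0 - cost_fraction)

        print(net_stored(100, 0.025))  # dietary fat: ~97.5 kcal stored
        print(net_stored(100, 0.20))   # carbs/protein converted to fat: 80.0 kcal stored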

    I think also there has been research done where they excise some fatty tissue and do analysis on it. They can tell what kind of fat the person eats based on the kind of fat in their fat cells.

    Comment by Marcy — 13 December 2006 @ 3:30 PM

  188. That is incorrect.

    No it’s not.

    Do you really think your body can’t just move molecules around and make something else?

    Yes; the body has to break foods down into molecules. Its own fats, cholesterol, and proteins are created inside the body itself. Your body can’t just use cow fat for your own; it needs to be broken down.

    Comment by Jason Godesky — 13 December 2006 @ 3:39 PM

  190. Oh, and tooth decay is not caused by acid from sugar meeting amylase. It’s caused by the bacteria colonizing the mouth. The bacteria ferment glucose and produce acid.

    Comment by Marcy — 13 December 2006 @ 3:41 PM

  191. Oh, and tooth decay is not caused by acid from sugar meeting amylase. It’s caused by the bacteria colonizing the mouth. The bacteria ferment glucose and produce acid.

    OK; the basic point remains that a highly sugary diet is at odds with the chemistry of our mouths; hence, dental problems.

    Comment by Jason Godesky — 13 December 2006 @ 3:44 PM

  192. “Yes; the body has to break foods down into molecules. Its own fats, cholesterol, and proteins are created inside the body itself. Your body can’t just use cow fat for your own; it needs to be broken down.”

    Yeah, I know that. What I mean is that since fat, carbs, and proteins are made of carbon, hydrogen, oxygen, and nitrogen, your body can manufacture any one of them from any other one (after it breaks them down, of course). Your brain needs glucose as fuel. If you eat all protein and no carbs, your body will take the carbon, hydrogen, and oxygen from the protein (and get rid of the nitrogen) and make glucose from it.

    And as for that link, it says: “Moreover, within the United States, a substantial decline in the percentage of energy from fat during the last 2 decades has corresponded with a massive increase in the prevalence of obesity. Diets high in fat do not appear to be the primary cause of the high prevalence of excess body fat in our society, and reductions in fat will not be a solution.”

    First of all, I think it’s debatable whether or not Americans are truly eating less fat. And also, eating way more calories than you need OF ANYTHING will cause a person to gain weight. So, yeah, eating lots of carbs will put weight on a person, if they’re eating more than they need. But that doesn’t change the fact that dietary fat is still put into storage more easily than carbs and protein. Fat is broken down into fatty acids and glycerol. There are different types of fatty acids, and different fats have different proportions of them. For example, animal fat is made up of mostly stearic acid. Olive oil is mostly oleic acid. The storage form of fat is triglycerides: three fatty acids bound with glycerol. It’s a lot easier for your body to take the fatty acids from dietary fat and combine them for storage. So yeah, if they cut a hunk of fat out of your body, they can see what types of fatty acids make it up and determine what types of fat you ate. As for what type of fatty acid carbs and proteins are converted to, I don’t know.

    Comment by Marcy — 13 December 2006 @ 4:09 PM

    And also, eating way more calories than you need OF ANYTHING will cause a person to gain weight.

    That’s precisely the wrong attitude. I could eat a few million calories of fiber every day and I’d still starve to death. What you’re eating is far more important than how much.

    But that doesn’t change the fact that dietary fat still is put in storage easier than carbs and protein.

    It’s been a while since I studied this in depth, but that doesn’t sound right. There was certainly a time when that was believed, but I remember reading quite a few papers that overturned it. But this was some time ago, and I’d be hard-pressed to find those articles again.

    Comment by Jason Godesky — 13 December 2006 @ 4:13 PM

  194. Hey –

    Marcy: Oh, and tooth decay is not caused by acid from sugar meeting amylase. It’s caused by the bacteria colonizing the mouth. The bacteria ferment glucose and produce acid.

    Jason: OK; the basic point remains, that a highly sugary diet is at odds with the chemistry of our mouths, hence, dental problems.

    High sugar/refined starch diets change the acidity and composition of our saliva to make our mouths a better ecology for the bacteria that damage our teeth.

    That’s partly why fruit is not nearly so damaging: the sugars are more complex AND fruit comes with its own complex assortment of acids, etc.

    Janene

    Comment by janene — 13 December 2006 @ 4:32 PM

  195. Jason, very well done; you may save a few lives and convert people to a healthy lifestyle.
    I have been as close as possible to a forager’s diet for the past six years with excellent results, i.e., 68 years old, no medical problems or pharma drugs.
    Here is a study that backs you up.

    STANFORD RESEARCHERS FIND CAUSE, POSSIBLE CURE FOR GLUTEN INTOLERANCE
    STANFORD, Calif. – A team of investigators led by Stanford University researchers have discovered the cause and a potential treatment for celiac sprue, an autoimmune disease that leads to an inability to digest gluten, a major protein in wheat, rye and barley products. The disease is estimated to afflict as many as 1 in 200 Americans.
    In the Sept. 27 issue of Science, researchers identify a fragment of gluten called gliadin as the celiac culprit. They showed that this fragment is resistant to digestion and is responsible for the intestine-damaging inflammatory response experienced by celiac patients. They also report the use of a dietary enzyme made by a bacterium that can break down the fragment into harmless bits, suggesting future treatment through dietary supplements.
    “These findings are the first step to giving people with celiac disease real hope for a normal life,” said Chaitan Khosla, PhD, professor of chemistry, chemical engineering and, by courtesy, of biochemistry. Lu Shan, a graduate student in Khosla’s lab, was lead author on the paper. The team included other Stanford researchers as well as a group from the University of Oslo in Norway.
    The lining of the small intestine is normally carpetlike, covered with small protrusions called villi. Celiac disease, however, results in a smooth, pipelike intestine. The reduced surface area keeps the body from absorbing nutrients. Often diagnosed in childhood, the disease can lead to the distended stomach and stunted growth typical of starvation.
    “The only effective therapy for most people is a lifelong gluten-free diet, and that’s fairly restrictive,” explained co-author Gary M. Gray, professor of medicine, emeritus. The diet is essential over the long term both to restore normal intestinal function and to reduce the risk of developing osteoporosis, lymphoma or cancer of the small intestine, he added.
    In the laboratory, Shan simulated the digestive process, exposing gliadin to digestive enzymes in test tubes. She identified a protein fragment made up of 33 amino acids that was resistant to further digestion and whose structure was known to be toxic. Most proteins are broken down into small peptides of between two and six amino acids or into single amino acids. She then repeated her study in rats and again in test tubes using tissue taken by biopsy from patients undergoing unrelated medical procedures. “Even with prolonged treatment (exposure to intestinal enzymes), the peptide doesn’t lose the ability to induce the inflammatory response,” Shan said.
    When they looked more closely at the fragment, Shan and her colleagues found that it was made up of even smaller fragments already known to induce human T-cells to attack the intestine. The team in Norway then measured the ability of the gliadin fragment to induce autoimmune activity. “The response by T-cells was about 10 to 20 times higher than the smaller peptides themselves,” Shan said.
    Because the fragment is rich in the amino acid proline, investigators reasoned that a peptidase (an enzyme that breaks down proteins) with the ability to digest proline-rich chains might be able to break down the gliadin fragment, rendering it harmless to celiac patients. They have now shown that this is the case in test tubes and in rats. Because there are no animal models of celiac disease, testing this approach in humans is a long way off and will require further preclinical work, Khosla said.
    “We think that this mode of therapy – peptidase supplementation – may offer hope in treating celiac sprue eventually, and we’re going to test this hypothesis.”

    http://mednews.stanford.edu/releases/2002/september/gluten.html

    Comment by Ray — 11 August 2007 @ 6:53 AM

  196. I meant to highlight these few words from the above post.

    “She identified a protein fragment made up of 33 amino acids that was resistant to further digestion and whose structure was known to be toxic.”

    Comment by Ray — 11 August 2007 @ 6:58 AM

  197. Oh geez.

    I’m really interested in this subject matter, but I shook my head both at the jab at saturated fat in the original article and at some of the comments here.

    Saturated fat, according to Nina Planck, makes up about fifty percent of our cell membrane material. It is also necessary for the production of hormones and nerve cells, among other vital structures and functions. We were obviously getting plenty of it from animal sources, so I am unclear as to why it is a problem in industrial diets. Particularly as cholesterol isn’t a problem either–researchers actually paying attention to the data seem to think it’s a repair molecule, and that if someone has high cholesterol it’s because they have latent heart disease, not because they’ve been eating too much fat.

    As for us needing sugar and starch carbs because the brain needs glucose, that is just silly. We don’t need huge amounts of sugar and starch; in fact, they make us sick! Furthermore, the foods highest in those substances–grains and legumes–contain phytic acid and protease inhibitors which lead to such fun conditions as osteoporosis (and probably explain why people shrank in the Neolithic!). Additionally, soy contains goitrogens which damage the thyroid. Considering that we can manufacture all the glucose we need from fruits and vegetables and organ meats (yes, organ meats contain carbohydrate) and that the brain is more efficient on ketone bodies than it is on glucose (it needs some of the latter, but not a lot), a meat-and-vegetables diet is not half so unhealthy as grain advocates seem to think.

    It’s certainly more healthy for me. I became and remain obese on grain foods. If I cut them out of my diet, my blood sugar completely normalizes and I lose excess weight. My teeth also remain cleaner and my mood more even. That’s enough evidence for me, and any so-called “expert” suggesting my diet is unhealthy ought to have his head examined.

    The only thing grains and beans have got going for them is efficient energy storage that does not go bad immediately. Implementing an efficient agricultural system that did not overly disturb the topsoil and which took vegetable seasonality and animal husbandry into account would be far better for us than continuing to plow and plant cereal crops every year only to have to supplement with nutrient pills or else die from chronic disease.

    Comment by Dana — 8 September 2007 @ 1:42 PM

  198. Aaand… I read that Paleolithic Diet link and I don’t see any difference between it and the Atkins diet, minus the low-carb convenience foods–and Dr. Atkins initially counseled against using convenience foods at all, and they’re still not necessary to the diet.

    A little while ago I was reading a Peak Oil-related website and they were discussing what sort of diet one could sustainably follow post-Peak, and it was suggested that vegetable intake is very important, and then the author took a swipe at Dr. Atkins. Except… vegetables ARE allowed on Atkins. In fact they’re the primary source of what carbohydrates are allowed early in the diet.

    I don’t mean to take over your blog with shilling for a diet plan, but accuracy is important, and I’ve taken to correcting inaccuracies because unlike the people I read who say things like this, I have actually read his books. Considering they are generally available at public libraries, there really isn’t any excuse for anyone else to not have done so if they should feel moved to discuss the diet at all.

    I also owe Dr. Atkins something of a debt because thanks in part to his work, I can now read something like this blog essay and nod my head in agreement instead of saying, “What IS that nutcase yammering on about?” So there. :P

    Speaking of dietary lectins, you should read Peter D’Adamo’s works about his “blood type diet.” He says a lot of the same things you’re quoting here. And now I wonder whether the folks in Sweden who mutated to keep making lactase are type B, as D’Adamo thinks the gene for blood type B is clustered with other genes that make it easier for type Bs to digest milk, especially fermented milk. Very interesting.

    Comment by Dana — 8 September 2007 @ 2:25 PM

  199. Why is saturated fat considered a component of affluent malnutrition? I was given to understand it is an important component of cell membranes and that the body will synthesize it from the other fats we eat if it can’t get it directly from the diet.

    You’re going to get a lot of saturated fat in animal foods, so claiming that saturated fat’s a form of malnutrition and then extolling it as a part of the paleo diet is kind of weird.

    I feel better eating high-fat than I do eating high-carb. I think the studies “proving” saturated fat is harmful did not isolate that saturated fat intake from a concurrent high-carb, especially high-sugar, intake.

    Comment by Dana — 19 October 2008 @ 2:40 AM

  200. Some other things…

    Atkins as it was originally practiced, even in the last version of the book put out before Dr. Atkins’s death, called for something very like the Paleo diet in terms of what foods were allowed. Don’t confuse the induction phase of the diet with the entirety of the diet. (Even if you did, induction isn’t that different from Inuit dietary practices, at least inasmuch as the food groups involved are concerned–the food prep methods, of course, are another matter.) I felt good on Atkins and I would imagine I’d feel just as good on paleo.

    The milk question is an interesting one. The Weston Price folks claim that many of the physical problems suffered by the average milk-drinker disappear when that person drinks raw milk instead of pasteurized. I would be interested to know if that’s the case. They also claim, which I find easier to believe, that lacto-fermented milk is better still. Not only is most of the lactose digested by the lactobacteria, but the casein and fat are broken down into forms more digestible by the human body as well. This is why yogurt was such a hit in the Middle East–they almost never drank fresh milk. And if you look at the traditional diets of Scandinavian people you will find a preponderance of lactofermented milk over fresh as well.

    I think there may even be a difference in outcomes with grain foods based on preparation methods. I wish I had some way to find out. Traditional sourdough bread, for instance, is more broken down structurally than modern bread made with baker’s yeast. I wonder if this affects the lectins as well. It certainly affects phytic acid and probably affects some or all of the enzyme blockers. Then there are Ezekiel and Essene breads, which are made from sprouted grains and so eliminate phytates and enzyme blockers by definition.

    For someone who is trying to transition to a more sustainable diet and one not so hard on the human body I could see where taking up home cooking and practicing fermentation methods might make a big difference. This would be especially useful to people who don’t want to give up what they view as “traditional” fare just yet (i.e., bread).

    Comment by Dana — 19 October 2008 @ 2:50 AM
