I have already had a few commenters direct me to “Noble or Savage?,” the article from the Dec. 19 Economist magazine. The article has not raised my low opinion of this periodical. As Kenneth Boulding so correctly assessed, “Anyone who believes that growth can go on forever in a finite world is either a madman or an economist.” You may recall that The Economist teamed up with Shell some years back to give us the absurdist essay contest question, “Do we need nature?” (Derrick Jensen gave perhaps the best answer: “It’s insane.”) But this most recent offering presents precisely the kind of article I have, unfortunately, become all too familiar with—overblown rhetoric based on faulty evidence presented deceptively. Nothing appears in the article that we haven’t spent pages debunking here in past articles, but we can hardly expect casual readers to have read that much of the Anthropik backlog. Since I have no doubt that many will continue to post links to this inane article mistaking its argument for a cogent one, I offer this piece. It has little new for regular readers; instead, I have simply collated my previous responses to the evidence misrepresented by The Economist article, so that it all appears in one place.
Several archaeologists and anthropologists now argue that violence was much more pervasive in hunter-gatherer society than in more recent eras. From the !Kung in the Kalahari to the Inuit in the Arctic and the aborigines in Australia, two-thirds of modern hunter-gatherers are in a state of almost constant tribal warfare, and nearly 90% go to war at least once a year. War is a big word for dawn raids, skirmishes and lots of posturing, but death rates are high—usually around 25-30% of adult males die from homicide. The warfare death rate of 0.5% of the population per year that Lawrence Keeley of the University of Illinois calculates as typical of hunter-gatherer societies would equate to 2 billion people dying during the 20th century.
Lawrence Keeley’s War Before Civilization: The Myth of the Peaceful Savage really represents the first work in this trend, and continues to provide the foundation for the later work of LeBlanc, Walker and Knauft. Much of Keeley’s evidence centers on archaeological evidence of fortified villages around the world. Of course, that merely proves that food producers engaged in warfare; Keeley’s farther-reaching assertion that warfare is endemic to human nature goes far beyond the evidence he provides of horticultural warfare, and his work has been heavily criticized on this point. Though we have cave paintings going back into the Upper Paleolithic 40,000 years ago, it is only about 10,000 years ago, with the invention of the bow, that we see the first cave paintings of groups fighting. We don’t see paintings of people fighting with clubs or even atlatls; instead, we see them fighting first with bows and arrows. Some paintings even portray still-recognizable tactical techniques like flanking and envelopment. It is also at this time that the first skeletal evidence of warfare emerges, with bones showing evidence of violent death, arrowheads embedded in skeletal remains, and so forth. This is quite late in our history as a species, and once again correlates to the rise of food production.
Keeley tries to expand his case by looking at evidence of violence among modern hunter-gatherers, most notably the Plains Indians. “Keeley finds brutish behavior everywhere and at all times, including among the American Indian. If the number of casualties produced by wars among the Plains Indians was proportional to the population of European nations during the World Wars, then the casualty rates would have been more like 2 billion rather than the tens of millions that obtained.”1 Anthropologists have learned that modern hunter-gatherers are not “living fossils” that necessarily preserve pre-civilized behaviors; at best, such comparisons illustrate the basic economics of hunter-gatherer life and the pressures it places on sharing and social bonds. Yet Keeley presents modern hunter-gatherers as precisely such fossils when he needs evidence of prehistoric warfare. The Plains Indians are a particularly ironic choice, given the evidence Peter Farb gathers in Man’s Rise to Civilization, As Shown by the Indians of North America from Primeval Times to the Coming of the Industrial State. There, Farb shows that the Plains Indians we know did not exist prior to European contact. They descended from refugees from other Native groups destroyed by the various European epidemics that wiped out 90% or more of North America’s population in the years after 1492, with a new culture assembled around two important European introductions: the re-introduction of the horse as wild herds proliferated and filled up the Americas, and guns traded from French fur trappers. The Plains Indians had a post-apocalyptic culture.
Given the trauma of what was essentially the end of the world for Native groups, a surge in violence would be expected. 90% or more of the American population died from epidemic disease. Groups were displaced, and a massive rearrangement of tribal territories knocked across the continent like billiard balls long in advance of European settlers. When “the Pilgrims” came to Plymouth, they were aided in setting up their new colony by a Patuxet native named Tisquantum (better known to Europeans as “Squanto”). He was the primary contact for the Europeans on behalf of the Wampanoag Confederacy because he already knew English: he was captured by George Weymouth in 1605, worked for nine years in London, and returned to the New World with John Smith (of “Pocahontas” fame) in 1613, only to be dropped off on the wrong part of the continent, kidnapped by Thomas Hunt, and returned once more to London after his escape. He finally made it back home in 1619, to discover that his home village had been wiped out, probably by smallpox. When “the Pilgrims” arrived, they set up their colony on the ruins of Tisquantum’s home town. Tisquantum’s twisted tale illustrates the enormity of the European impact on Native lives long before what we would normally consider the first point of European contact. By the time Europeans came, most places they arrived were already shattered. The archaeological record bears out a significant increase in violence in this post-apocalyptic era.
Researchers examined thousands of Native American skeletons and found that those from after Christopher Columbus landed in the New World showed a rate of traumatic injuries more than 50 percent higher than those from before the Europeans arrived.
“Traumatic injuries do increase really significantly,” said Philip L. Walker, an anthropology professor at the University of California at Santa Barbara, who conducted the study with Richard H. Steckel of Ohio State University.
The findings suggest “Native Americans were involved in more violence after the Europeans arrived than before,” Walker said. But he emphasized there was also widespread violence before the Europeans came.
Nevertheless, he said, “probably we’re just seeing the tip of the iceberg” as far as the difference between violence levels before and after. That’s because as many as half of bullet wounds miss the skeleton. Thus, the study couldn’t detect much firearm violence, though some tribes wiped each other out using European-supplied guns.2
Of course, such an increase does indicate that violence occurred prior to any European contact at all. That’s certainly true of the New World states in Mesoamerica and the Andes, but some level of violence should be expected at all times and places. Violence has a place in the natural world. Part of our civilization’s twisted view of the world is its inability to come to terms with violence in a healthy way. There is an inherent violence to animal life—a fact that philosophers have often commented upon. The chopstick has its origins in Confucius’ observation that stabbing food with a knife is a violent act, and his desire to move away from that. The Buddhist desire to do no harm finds its apex in some monks who wear shoes like short stilts, to minimize the possibility of stepping on a bug. Ethical vegetarians often eschew the eating of meat because they do not want to kill anything, so instead, they kill vegetables. Whenever and wherever we try to eliminate the harm we do, we find ourselves running into the essentially violent nature of animal life: we live because others die.
Shamanism is a hunter’s religion. The forager life does not afford the luxury of such self-deceit: it requires the predator to make his peace with his animal nature, including the inherent, inescapable fact of violence. No animal lives, except by the deaths of others. The mediation of life and death in a universe carefully balanced between them is at the heart of everything the shaman does. At the heart of agricultural philosophy, however, is a desire to escape this system: to have life, and never death; growth, and never decay; health, and never sickness. In the end, it is a fool’s dream doomed to failure, but the longer it goes on, the more death, decay, and sickness is needed to balance out the folly. Normally, the forager’s life balances these in small, manageable portions. Civilization distinctly amplifies violence in its unhealthy flight from it.
Keeley cites evidence that the wars of the Plains Indians were, per capita, as costly as the two world wars. He further cites evidence that the !Kung homicide rate rivals that of the inner city. Knauft3 points out that such violence comes from the changing nature of Bushmen life as their old way of life is shattered and they are settled into a new, sedentary lifestyle. Kent4 further points out that Keeley’s statistic is taken out of context. In addition to these criticisms, I would draw attention to the double standard applied to violence in our own civilization, versus how it is applied to “primitive” societies.
Keeley makes the argument that, because horticultural and forager peoples have such low population density, a single homicide could be a war. Of course, Keeley ignores the systematic violence of civilization: he lumps together all occasions of violence in “primitive” societies, and compares them to our own wars, or our own homicide rates, but never the combination. Neither does he take into consideration the toll of our violence professionals: the violence done by police, or incarceration (1 in 37 adults in the United States), or the essential violence of the market and tax system we’re forced into by civilization’s violence professionals. In civilization’s pursuit of a peaceful society, the “Monopoly of Force” has succeeded only in making every aspect of society violent. When we consider fully that the violence Keeley cites is that of an essentially post-apocalyptic society, and that he is comparing it to only a small portion of our own violence, it becomes clear that contrary to Keeley’s case, civilization has vastly increased violence. It has made violence ubiquitous, and it has changed the fundamental nature of it.
No society can ever be fully devoid of violence, but those that aspire to such a goal only become more violent by denying its place in the world. Primitive societies did engage in violence, and without a permanent class of professional killers, it fell to primitive peoples themselves to execute what violence became necessary. Perhaps that is in part why such societies also did so much to limit violence. Contemporary charges against primitive warriors rely on observations of a “post-apocalyptic” society decimated by European contact, ignoring the evidence that violence in these societies has been increased significantly because of the overwhelming impact of European contact. What we do see, however, is ample evidence of means to limit violence—emphasis placed on bravery and intimidation to avoid violence from breaking out, ritual approaches aimed at reconciling enemies, and alternative forms of contesting differences, such as song duels or counting coup. To properly compare the effectiveness of such approaches to our own, we need to take an honest accounting of violence in our own society—wars, murder, violent crime, incarceration, police brutality, and the full impact of our professional violence class. We need to look also to the ubiquitous violence inherent in our social system: the threat of violence that lies behind paying your rent, obtaining your food, and every other aspect of civilized existence. Primitive societies were not devoid of violence, but they did limit it, and it was a much rarer thing. Among them, violence was something that happened. For us, it’s a way of life.
Evolution of Violence
At first, anthropologists were inclined to think this a modern pathology. But it is increasingly looking as if it is the natural state. Richard Wrangham of Harvard University says that chimpanzees and human beings are the only animals in which males engage in co-operative and systematic homicidal raids. The death rate is similar in the two species. Steven LeBlanc, also of Harvard, says Rousseauian wishful thinking has led academics to overlook evidence of constant violence.
Jane Goodall turned our understanding of chimpanzees upside down. From her unprecedented observations, we’ve taken the view of chimpanzees as violent and domineering, a view that’s been used to support the claim that humans, too, are by nature violent, and that dominance hierarchies are natural to us. But how did Goodall get such unprecedented access? By giving them food. I’ve not read The Egalitarians—Human and Chimpanzee: An Anthropological View of Social Organization, but I do think that Theodore Kemper’s review in The American Journal of Sociology (Vol. 97, No. 6 (May, 1992), pp. 1757-1759) deserves a handy URL for future reference.
Essentially, Power argues that because human hunter-gatherers and chimpanzees in the wild share the same ecological niche, their social organization is remarkably similar. The qualifier, in the wild, is significant, inasmuch as the dominant paradigm in chimpanzee studies today derives from the later work of Jane Goodall, who reports that the animals are strongly territorial, aggressive, and dominance-seeking. Whereas Goodall’s analysis might support a theory of phylogenetic continuity for similar, biologically inherent, agonistic qualities in humans, Power’s important contribution is to show that Goodall’s conclusions may rest principally on the “unnatural” environment that Goodall herself created for the apes in order to facilitate observation of their behavior.
When Goodall began her naturalistic studies of chimpanzees in 1960 in the Gombe National Park area of Tanzania, she was a distinctly non-participant observer. After some years of patiently tracking apes over large areas, Goodall discovered that she could lure animals into a more or less permanent presence around her camp, thereby improving opportunities to observe social interaction, by baiting the camp with supplies of bananas. Indeed, this was an inspired notion. According to Power, it worked too well.
Power maintains that the change that Goodall engineered in the food supply warped the chimpanzees’ conduct and social organization more or less permanently. Power pursues the argument by examining the differences between Goodall’s observations prior to the artificial feeding regimen and the subsequent findings. Goodall herself does not rely much on the results of her early work.
Power argues that, like human hunter-gatherers, chimpanzees in the wild roam widely, rarely confronting each other in direct competition over food. Goodall’s artificial feeding, practiced from 1964 to 1968, introduced direct competition among the apes for the first time. Bunched around the feeding boxes and often frustrated by not obtaining the bananas (which were doled out according to specific schedules), the animals began to engage in more intense forms of competitive, aggressive, and threatening behavior than was known to occur in the wild.
Goodall’s work has been heralded for bringing us observations of chimpanzees in the wild. It sounds like that was only made possible by taking away the “wild” part. We’ve discussed in depth elsewhere on this site how the hoarding of food allowed by agriculture allowed elites to emerge, with hierarchies and coercive force to maintain them. Goodall’s observations of chimpanzees have often been used to excuse this order as “natural,” but Power’s work suggests that instead, Goodall’s unconsidered civilized assumptions succeeded in civilizing chimpanzees. [Note: Goodall herself has admitted that this method of observation biased her results.] The shoe seems to fit the other foot: Hobbesian wishful thinking has led academics to overlook evidence of relatively peaceful existence.
The “Overkill” Hypothesis
Returning to hunter-gatherers, Mr LeBlanc argues (in his book “Constant Battles”) that all was not well in ecological terms, either. Homo sapiens wrought havoc on many ecosystems as Homo erectus had not. There is no longer much doubt that people were the cause of the extinction of the megafauna in North America 11,000 years ago and Australia 30,000 years before that. The mammoths and giant kangaroos never stood a chance against co-ordinated ambush with stone-tipped spears and relentless pursuit by endurance runners.
The standard rejoinder when discussing the ecological devastation wrought by civilization and the relatively benign existence of hunter-gatherers is to point out that hunter-gatherers caused extinctions of their own—the extinction of the megafauna at the beginning of the Holocene interglacial, including the mammoths, the cave bear, the giant hyaena, the giant rat of Majorca, the American horse, saber-toothed cats, diprotodons (giant relatives of the wombats), and an Australian, leopard-sized marsupial lion, among many others. These, according to Paul S. Martin’s “overkill” theory, were hunted to extinction by ravenous, Pleistocene human foragers every bit as rapacious in their ecological exploitation as any modern civilization. This is an ill-founded anachronism of the highest order.
Overkill theorists take their cues from the extinctions in New Zealand, which was once home to 11 different species of large, flightless birds called moas. Within a few centuries of human habitation, they were extinct. “We do know that human colonists caused extinctions in isolated, tightly bound island settings, but islands are fundamentally different from continents,” says Donald Grayson. “The overkill hypothesis attempts to compare the incomparable and there is no evidence of human-caused environmental change in North America. But there is evidence of climate change. Overkill is bad science because it is immune to the empirical record.”
Another quote from Grayson puts an even finer point on it: “Martin’s [overkill] theory is glitzy, easy to understand and fits with our image of ourselves as all-powerful … It also fits well with the modern Green movement and the Judeo-Christian view of our place in the world. But there is no reason to believe that the early peoples of North America did what Martin’s argument says they did.”
For one thing, while we can certainly understand the extinction of mammoths or bison in terms of human overhunting, what of all the other animals that died off at the same time? Were humans really hunting saber-toothed tigers, and in such numbers as to drive a robust and healthy species into extinction? Were aborigines overhunting the diprotodon, which likely became enshrined in their mythology as the demonic bunyip? We find plenty of mammoth bones at human habitation sites, but none from these other species.
In “Climate Change Caused Extinction of Big Ice Age Mammals, Scientist Says,” written for National Geographic News in November 2001, Hillary Mayell writes:
The overkill hypothesis, Grayson says, rests on five tenets: human colonization can lead to the extinction of island species; the Clovis people were the first humans to arrive in North America, around 11,000 years ago; the Clovis people hunted a wide range of large mammals; the extinction of many species of North American megafauna occurred 11,000 years ago; and therefore, Clovis hunting caused those extinctions.
Grayson disputes several of these tenets.
There is no proof, he said, that the late Pleistocene extinctions occurred in conjunction with the arrival of the Clovis people. “Of the 35 genera to have become extinct beginning around 20,000 years ago, only 15 can be shown to have survived beyond 12,000 years ago,” Grayson said. “The Clovis peoples didn’t arrive until shortly before 11,000 years ago. That leaves 20 [genera] unaccounted for.”
There is also no evidence that the Clovis people hunted anything other than mammoths, he said. Although numerous sites where large numbers of mammoths were killed have been uncovered, no similar sites for any other large mammals have been found in North America.
And while there is no evidence of widespread human-caused environmental change similar to that seen on island settings, there is evidence that animal populations in Siberia and Western Europe, as well as North America, were affected during the same period by climate changes and glacial retreat.
In addition, the primacy of the Clovis migration has itself come under serious assault, first with artifacts from Meadowcroft Rock Shelter in Pennsylvania dating back 30,000 years. We now have several sites with artifacts similarly dating back to three times the age of Clovis, at Monte Verde, Chile, Cactus Hill, Virginia, and the Topper site on the Savannah River near Allendale, South Carolina. Mitochondrial DNA from Native Americans shows a divergence from the Siberian population 20,000 years ago—again, significantly older than Clovis. This makes the idea that the Clovis were the first people in North America increasingly untenable—and if the Clovis were not the first people in North America, that means that people were living in North America well before the mass extinction began, shattering the “Overkill” hypothesis that humans wiped out these species when they entered North America.
That might reconcile Grayson’s point about earlier extinctions, but some genera were going extinct even before that. It appears that bison, at least, were on the decline already when human hunters made their entrance. A report from USA Today says:
A team of 27 scientists used ancient DNA to track the hulking herbivore’s boom-and-bust population patterns, adding to growing evidence that climate change was to blame.
“The interesting thing that we say about the extinctions, is that whatever happened, it wasn’t due to humans,” said the paper’s lead author, Beth Shapiro, a Wellcome Trust Research Fellow at Oxford University. By the time people arrived, “these populations are already significantly in decline and on the brink of whatever was going to happen to them in the future.”
The story written into the bison’s DNA is one of an exponential increase in diversity with herd sizes doubling every 10,200 years. Then, 32,000 to 42,000 years ago, the last glacial cycle kicked in, beginning a lengthy cooling trend. Bison genetic diversity plummeted. A significant wave of humans didn’t appear in the archaeological record at eastern Beringia until more than 15,000 years later, the authors write in Friday’s Science.
That evokes the alternative “overchill” theory—that the megafauna extinctions were caused not by human predation, but by climate change. The very same climate changes that revealed the Bering land bridge, that made it so easy for the first Polynesians and Australians to hop from island to island, and made it possible for humans to experiment with new niches, also created new conditions that some species adapted to better than others.
Humans were not ecological saints, either. We undeniably caused extinctions, such as that of the moas of New Zealand. So does any alpha predator in a new ecology. Alpha predators like humans play keystone roles in any ecology, and introducing such a predator into any new ecology will cause cascades of change. Some species will prosper; others will adapt; still others will go extinct. Moving into a new ecology during a significant climate change meant that the new variable was more than many species could handle. Animals already in decline could not handle the extra pressure, and went extinct. This is not a distinctly human behavior: we can see much the same in Yellowstone:
Ripple points to some black-and-white photographs taken of the same spot in the Lamar Valley more than 50 years apart. “You can see that young aspen and willow were abundant in the early 1900s. By the 1930s the trees had stopped regenerating, and there are no young ones.
“I had a lightbulb,” he continues. He took core samples from 98 aspen trees and discovered that only two had begun to grow after the 1920s—around the time the last substantial populations of wolves were killed or driven off. And these two were in places that elk would be hesitant to frequent for fear of being attacked by predators. Ripple found big trees and tiny trees but nothing in between, because nothing new grew from the 1930s to the 1990s. It was the first concrete evidence of a “wolf effect.”
After human farmers drove the wolves from Yellowstone, the elk populations boomed. They stripped their food supply bare, eating the shoots of young trees as they came up. That began to change the ecology of Yellowstone, driving out songbirds, introducing erosion, and generally wreaking havoc. The re-introduction of wolves to Yellowstone in 1995 has been a smashing success. Elk numbers have gone down, trees have returned, songbirds have come back, and the effects of erosion are beginning to heal. Such is the kind of far-reaching relationship that any alpha predator shares with its ecology.
Given that understanding of an alpha predator’s role in any ecology, and the foregoing evidence, how can we conclude that all of these species were wiped out by a few African apes, without any input whatsoever from the changing ecology around them? No, there was no noble savage; but there was no murderous savage, either. Humans were not created good or evil—just human. Our entrance into the Americas, Oceania and the rest of the world was as harmless as wolves, lions or sharks. My words there are carefully chosen. We don’t normally consider wolves, lions or sharks particularly “harmless,” and neither were humans. But we recognize the place such predators have in the natural world. We recognize that they’re part of a bigger picture. We know that introducing them into a new situation will have far-reaching effects on that situation, but we also know that’s not a reflection of their own nature, but the nature of ecology itself. Just like humans.
Labor & Leisure
What’s more, the famously “affluent society” of hunter-gatherers, with plenty of time to gossip by the fire between hunts and gathers, turns out to be a bit of a myth, or at least an artefact of modern life. The measurements of time spent getting food by the !Kung omitted food-processing time and travel time, partly because the anthropologists gave their subjects lifts in their vehicles and lent them metal knives to process food.
[F]oragers work much less than we do today. Richard Lee’s initial assessment of the !Kung work week is neatly summarized by Sahlins:
Despite a low annual rainfall (6 to 10 inches), Lee found in the Dobe area a “surprising abundance of vegetation”. Food resources were “both varied and abundant”, particularly the energy rich mangetti [a.k.a., mongongo] nut—”so abundant that millions of the nuts rotted on the ground each year for want of picking.” The Bushman figures imply that one man’s labour in hunting and gathering will support four or five people. Taken at face value, Bushman food collecting is more efficient than French farming in the period up to World War II, when more than 20 per cent of the population were engaged in feeding the rest. Confessedly, the comparison is misleading, but not as misleading as it is astonishing. In the total population of free-ranging Bushmen contacted by Lee, 61.3 per cent (152 of 248) were effective food producers; the remainder were too young or too old to contribute importantly. In the particular camp under scrutiny, 65 per cent were “effectives”. Thus the ratio of food producers to the general population is actually 3:5 or 2:3. But, these 65 per cent of the people “worked 36 per cent of the time, and 35 per cent of the people did not work at all”!
For each adult worker, this comes to about two and one-half days labour per week. (In other words, each productive individual supported herself or himself and dependents and still had 3 to 5 days available for other activities.) A “day’s work” was about six hours; hence the Dobe work week is approximately 15 hours, or an average of 2 hours 9 minutes per day.
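The arithmetic behind Sahlins’s summary is easy to check for yourself. A minimal sketch, using only the rounded figures quoted above (152 of 248 effectives, 2.5 days of labour per week, a 6-hour working day):

```python
# Checking the Dobe !Kung work-week figures summarized by Sahlins,
# from the rounded numbers quoted in the passage above.

effectives = 152      # effective food producers in the contacted population
population = 248      # total free-ranging Bushmen contacted by Lee
days_per_week = 2.5   # days of subsistence labour per adult worker
hours_per_day = 6     # length of a "day's work"

producer_ratio = effectives / population            # share of producers
weekly_hours = days_per_week * hours_per_day        # hours of work per week
daily_hours = weekly_hours / 7                      # averaged over all 7 days
daily_minutes = round((daily_hours % 1) * 60)       # leftover minutes

print(f"effective producers: {producer_ratio:.1%}")         # → 61.3%
print(f"work week: {weekly_hours:.0f} hours")               # → 15 hours
print(f"per day: {int(daily_hours)} h {daily_minutes} min") # → 2 h 9 min
```

The figures fall out exactly as quoted: 61.3 per cent effectives, a 15-hour work week, and an average of 2 hours 9 minutes per day.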
This is the oft-quoted “two hours a day” statistic, but it has come under fire from critics who point out that Lee did not add in other necessary activities, such as creating tools and food preparation. So, Lee returned to do further study with these revised definitions of “work,” and came up with a figure of 40-45 hours per week. This might seem to prove that hunter-gatherers enjoy no more leisure than industrial workers, but the same criticisms laid against Lee’s figures also apply against our “40 hour work week.” Not only is that increasingly a relic of a short era sandwiched between union victories and the end of the petroleum age, as the work week stretches to 50 or even 60 hours, but it, too, does not include shopping, basic daily chores, or food preparation, which would likewise swell our own tally. Finally, the distinction between “work” and “play” is nowhere nearly as clear-cut in forager societies as it is in our own. Foragers mix the two liberally, breaking up their work haphazardly, and often playing while they work (or working while they play). The definition of work which inflates the total to 40-45 hours per week includes every activity that might conceivably be considered work, regardless of its nature. Even the most unambiguous “work” of foragers is often the stuff of our own vacations: hunting, fishing, or a hike through the wilds.
Agriculture as Adaptation
Agriculture was presumably just another response to demographic pressure. A new threat of starvation—probably during the millennium-long dry, cold “snap” known as the Younger Dryas about 13,000 years ago—prompted some hunter-gatherers in the Levant to turn much more vegetarian. Soon collecting wild grass seeds evolved into planting and reaping crops, which reduced people’s intake of proteins and vitamins, but brought ample calories, survival and fertility.
“Thesis #10: Emergent Elites Led the Agricultural Revolution,” The Anthropik Network, 11 October 2005:
The hypothesis states that agriculture had to be adopted because of rising populations through the Mesolithic. Yet, for any given grain of wheat, there is a decision to be made. One can either eat it, or plant it, but never both. Planting wheat is an investment of food; it’s sacrificing food now, in order to have more food in the future. Investment is not an activity engaged in by people lacking resources; it’s something only people with resources to spare indulge in. Poor people aren’t very big in the stock market, and starving people who buried all their rice would never survive long enough to reap the harvest. We take it nearly without argument that the Neolithic began with increasing, hungry populations, but there are two questions left unanswered:
- Since human population is a function of food supply, where did this population come from? and
- Why did starving populations bury their wheat, instead of eating it?
At the end of the Pleistocene, the climate fluctuated, alternating between times of plenty and times of want. While starvation is rare and it would be a stretch to call the bad times “famine,” some years are undeniably harder than others.
In such uncertain times, “Big Men” emerge, providing some level of stability. In fat years, their lavish potlatches and mokas increase their own prestige and indebt neighboring groups, providing insurance against the hard years that will follow. These Big Men further bolster their position within the group, and cultivate a reciprocity network beyond the group, by using their power and influence to engage in long-distance trade. As a last resort, when all other possibilities are gone, they can call on neighboring Big Men to provide food.
These late Mesolithic foragers spend more and more time cultivating at more intensive levels, to produce enough food for the escalating competition of the Big Men’s feasts. It is hard, and they must sacrifice the freedom and leisure of their former life, but at least they have some security. Eventually, those Big Men have sufficient influence to make their followers stop thinking of themselves as hunters who farm, and begin thinking of themselves as farmers who hunt.
Big Men become chiefs, chiefs become kings, populations explode and civilization moves inexorably from that beginning to the present crisis.
In the years since 9/11, a quote from Benjamin Franklin has enjoyed renewed popularity in certain circles: “They that can give up essential liberty to obtain a little temporary safety deserve neither liberty nor safety.” The loss of civil liberties and freedoms suffered by the United States’ citizenry under the second Bush regime, though significant, remains small when compared to the freedoms lost 10,000 years ago when our forebears (memetically, if not genetically) took up civilization. Agriculture is a hard life, as we have already seen. Malnutrition and disease followed almost immediately; war, tyranny and poverty followed inexorably. Because it relies solely on domesticated crops, intensive agriculture is the only subsistence technology truly susceptible to real famine. The safety the Big Men offered was illusory; in fact, that ancient bargain put us in a more precarious position than we had ever known, or will likely ever know again.
Ten thousand years ago, our ancestors traded the bulk of that very real freedom that is our species’ birthright for a little temporary safety. If there is an original sin, a fall of man, that was it. From that day to this, we have not deserved—nor have we had—either one.