“The Savages are Truly Noble”

by Jason Godesky

In his 1609 Histoire de la Nouvelle France, Marc Lescarbot noted that the L’nu (popularly known as the Mi’kmaq or “Micmac”) allowed everyone to hunt freely. In Europe, hunting was permitted only for the nobility, leading Lescarbot to remark, “the Savages are truly noble.” The term “Noble Savage” first appears in English in John Dryden’s 1672 play, The Conquest of Granada: “I am as free as Nature first made man, / Ere the base laws of servitude began, / When wild in woods the noble savage ran.” The term then disappeared for nearly two centuries, resurfacing in English only with the derogatory use of John Crawfurd, president of the Ethnological Society of London, in 1859.

Ter Ellingson’s The Myth of the Noble Savage is not simply another debunking of the “Noble Savage” myth to assuage our guilt by telling us that Native peoples are as warlike and destructive as we are today. Rather, the myth of the “Noble Savage” is that anyone ever believed it at all.

Ellingson devotes the last two thirds of the book to a consideration of “Popular Views of the Savage” (the title of Ch. 11) and the emergence of the racist anthropology that was promoted most strongly by James Hunt, a member of the Ethnological Society of London and a strong polemicist for the superiority of the white race. Here Ellingson’s narrative moves more toward an examination of the politics of science, describing a “disastrous meeting” of the Ethnological Society in 1858 or 1859 at which only six members were in attendance. Subsequent events allowed Hunt and Crawfurd to take control of the society—as secretary and president, respectively—and use it as a forum to promote their racist views.1

The term “Noble Savage,” as we have it today, was a straw man concocted by white supremacists in order to take over the British Ethnological Society, end its opposition to slavery, colonialism, and other imperial atrocities, and use it instead as a vehicle to advance their own racist views.

Another physical anthropologist—or anatomist—was Dr. Robert Knox, who was implicated as a customer in the notorious Burke and Hare serial murder and body-snatching case and had to resign in disgrace from the Medical College of Edinburgh. Knox subsequently made his living lecturing on the supposed superiority of the Anglo-Saxon “race.” James Hunt, an anthropologist and Knox’s disciple, likewise exalted the “Anglo-Saxons,” but his colleague John Crawfurd (a Scot) held the Scots to be a “pure” race and the Anglo-Saxons a mixed or “bastard” one. In 1859, Hunt and Crawfurd banded together to take over the British Ethnological Society. They attacked the previous generation of liberal ethnologists, who had opposed slavery and colonialism and spoken out against the wrongs done to indigenous peoples, deriding their predecessors as “romantic philanthropists” and believers in “The Noble Savage”—Crawfurd wrote a paper of that name.2

Of course, regardless of the history of the term, there can be little doubt that writers, poets and activists, if not anthropologists, have believed in the basic form of the “Noble Savage.” As Rousseau began the Emile, “Everything is good in leaving the hands of the Creator of Things; everything degenerates in the hands of man.” Defining the precise elements of the “Noble Savage” can be difficult, but for our purposes, we can consider five facets:

  1. The Ecological Saint. The Noble Savage lives in harmony with nature.
  2. The Gentle People. The Noble Savage lives without war or violence.
  3. The Honest Injun. The Noble Savage lacks guile or deceit.
  4. The Super Human. The Noble Savage possesses uncanny physical health and senses.
  5. The Wise Indian. The Noble Savage has an innate wisdom and spiritual connection.

Some of these are merely exaggerations; others are not true at all. Yet, most people who try to discredit the “Noble Savage” offer up opinions far more inaccurate than those presented in the myth itself. Originally concocted as a straw man by white supremacists, the invocation of the “Noble Savage” today remains much the same—to justify any act of cruelty, violence, or ecological destruction we’re responsible for, by throwing up the scantiest evidence we can find to suggest that humans have always done such things, even when an honest examination of the evidence clearly indicates that something has radically changed.

The Ecological Saint

Natives are nature’s custodians: there’s another fallacy popularized by Jean-Jacques Rousseau’s panegyric on the Noble Savage. Voltaire was in the know when he said that Rousseau is to philosophers as the ape is to man. Rousseau certainly was uninformed by facts when he described natives as living in unity with nature. Less forgivable are the many present-day authors and researchers who, despite the corpus of research attesting to the lack of conservation among natives, persist in describing pre-Columbian America as “a pristine natural kingdom.”3

William Cronon’s famous 1995 essay, “The Trouble with Wilderness; or, Getting Back to the Wrong Nature,” detailed the Romantic underpinnings of the notion of “wilderness” as something separate from human activity. From this vantage point, Native Americans were no better than modern Americans: they, too, changed their environment. They used fire to maintain continent-wide gardens, terraformed the Amazon and the Great Plains, and reshaped the Americas continually. They wiped out the pristine “wilderness” simply by existing in it. Of course, as Luther Standing Bear wrote:

We did not think of the great open plains, the beautiful rolling hills, and winding streams with tangled growth, as “wild.” Only to the white man was nature a “wilderness” and only to him was the land “infested” with “wild” animals and “savage” people.

Those who attack the image of the ecologically noble savage point to horticultural techniques as if they were equivalent to modern-day logging, while paying little attention to what such techniques actually entailed. As Toby Hemenway pointed out:

Until the late 20th century, western anthropologists studying both ancient and current tropical cultures viewed equatorial agriculture as primitive and inefficient. Archeologists thought the methods were incapable of supporting many people, and so believed Central and South America before Columbus—outside of the major civilizations like the Aztec, Maya, and Inca—held only small, scattered villages. Modern anthropologists scouted tropical settlements for crop fields—the supposed hallmark of a sophisticated culture—and, noting them largely absent, pronounced the societies “hunter gatherer, with primitive agriculture.” How ironic that these scientists were making their disdainful judgements while shaded by brilliantly complex food forests crammed with several hundred carefully tended species of multifunctional plants, a system perfectly adapted to permanent settlement in the tropics. It just looks like jungle to the naive eye.4

Horticultural techniques used by Native groups (many of them now employed by permaculturalists) encouraged higher levels of succession, with species that supported human prosperity while simultaneously encouraging the biodiversity and health of their ecosystem. Rather than putting the two at odds like a modern logger, horticultural techniques put human welfare and ecological health in tandem. Yes, they reshaped the land around them and had a significant ecological impact. If we’re looking for an animal that has no impact on the world around it, we’ll be continually disappointed. They did not share our cultural fantasy about the unique separation of Homo sapiens from the living world. Rather, they took their responsibilities seriously: to give back to the land more than they took from it. This is the essence of sustainability, and we can see the clear evidence of Native sustainability in our own existence. As Derrick Jensen points out, what’s worked in the long term has worked in the long term. Since the Industrial Revolution, we’ve entered an era of mass extinction, with 200 species being wiped out every day. If Homo habilis had truly been as destructive as the modern American, Homo habilis would barely appear in the archaeological record at all, and no humans would be alive today.

Critics invariably point to the extinction of the megafauna as evidence that Native Americans were as destructive as modern Americans, citing the “Overkill” theory as if it were established fact. In reality, it’s almost certainly wrong.

The overkill hypothesis, Grayson says, rests on five tenets: human colonization can lead to the extinction of island species; the Clovis people were the first humans to arrive in North America, around 11,000 years ago; the Clovis people hunted a wide range of large mammals; the extinction of many species of North American megafauna occurred 11,000 years ago; and therefore, Clovis hunting caused those extinctions.

Grayson disputes several of these tenets.

There is no proof, he said, that the late Pleistocene extinctions occurred in conjunction with the arrival of the Clovis people. “Of the 35 genera to have become extinct beginning around 20,000 years ago, only 15 can be shown to have survived beyond 12,000 years ago,” Grayson said. “The Clovis peoples didn’t arrive until shortly before 11,000 years ago. That leaves 20 [genera] unaccounted for.”

There is also no evidence that the Clovis people hunted anything other than mammoths, he said. Although numerous sites where large numbers of mammoths were killed have been uncovered, no similar sites for any other large mammals have been found in North America.

And while there is no evidence of widespread human-caused environmental change similar to that seen on island settings, there is evidence that animal populations in Siberia and Western Europe, as well as North America, were affected during the same period by climate changes and glacial retreat.5

Furthermore, mounting evidence suggests that much of the extinction—including that of several animals humans didn’t hunt—was already well underway before humans came to the Americas.

A team of 27 scientists used ancient DNA to track the hulking herbivore’s boom-and-bust population patterns, adding to growing evidence that climate change was to blame.

“The interesting thing that we say about the extinctions, is that whatever happened, it wasn’t due to humans,” said the paper’s lead author, Beth Shapiro, a Wellcome Trust Research Fellow at Oxford University. By the time people arrived, “these populations are already significantly in decline and on the brink of whatever was going to happen to them in the future.”

The story written into the bison’s DNA is one of an exponential increase in diversity with herd sizes doubling every 10,200 years. Then, 32,000 to 42,000 years ago, the last glacial cycle kicked in, beginning a lengthy cooling trend. Bison genetic diversity plummeted. A significant wave of humans didn’t appear in the archaeological record at eastern Beringia until more than 15,000 years later, the authors write in Friday’s Science.6

That’s not to say that the introduction of humans into the Americas was without consequence; we would expect a certain number of extinctions as New World ecologies adapted to the presence of a new alpha predator, particularly under changing climatic conditions that, as the evidence shows, were already driving species to extinction. But this does not point to an innate human destructiveness beyond that of any predator entering a new ecology. The re-introduction of wolves to Yellowstone provides an excellent illustration.

Scientists do say that wolf predation has been significant enough to redistribute the elk. That has in turn affected vegetation and a variety of wildlife.

The elk had not seen wolves since the 1920’s when they disappeared from the park. Over the last 10 years, as they have been hunted by wolf packs, they have grown more vigilant.

They move more than they used to, and spend most of their time in places that afford a 360-degree view, said Mr. Smith. They do not spend time in places where they do not feel secure—near a rise or a bluff, places that could conceal wolves.

In those places willow thickets and cottonwoods have bounced back. Aspen stands are also being rejuvenated. Until recently the only cottonwood trees in the park were 70 to 100 years old. Now large numbers of saplings are sprouting. …

Where willows and cottonwoods have returned, they stabilize the banks of streams and provide shade, which lowers the water temperature and makes the habitat better for trout, resulting in more and bigger fish. Songbirds like the yellow warbler and Lincoln sparrow have increased where new vegetation stands are thriving.

Willow and aspen, food for beaver, have brought them back to the streams and rivers on the northern range. In 1996, there was one beaver dam on the northern range; now there are 10.

The number of wolves has also greatly increased the amount of meat on the ground to the benefit of other species.

Grizzlies and coyotes rarely kill adult elk, but each pack of wolves kills an elk every two or three days. After they eat their fill, other carnivores take over the carcass. Opportunistic scavengers like magpies and ravens make a living on the carcasses.

The number of coyotes, on the other hand, has fallen by half. Numbers of their prey—voles, mice and other rodents—have grown. And that, in turn bolsters the populations of red foxes and the raptors.7

Why would we not expect the introduction of humans to have similarly far-reaching effects? We can clearly see that when humans entered the Americas, it was already a time of significant ecological change. The same climate shifts that made it easier for the first humans to find their way to the Americas likewise pushed other species to the brink of extinction. For some, the cascading ecological effects of a new species may have been sufficient to push them over. The Clovis may even have overhunted the mammoth. But the idea that ancient Native American overhunting was the primary, or even a major, cause of the Holocene extinctions simply doesn’t stand up to critical scrutiny.

If our standard is some fantastic mythology of an animal that has no impact on the living world, then we can certainly find evidence to indict primitive cultures. But such cultures would be the first to reject such a ludicrous standard. Daniel Quinn called them “as harmless as a shark, tarantula, or rattlesnake.” They know full well that they’re part of the world, and have as much to do with it as anything else in it. That’s precisely why it’s so important in their world to give back more than you take. The first European colonists to come to the Americas noted skies turned black by flocks of birds, and rivers so full of fish you could catch them with your bare hands.

Our ancestors - the ancestors of some of us, that is - knew. When Europeans first arrived on the land that would eventually become the United States they found a land truly blessed by the divine. Their accounts speak of an abundance few would recognize today.

On the East Coast, birds, including now-extinct species such as the great auk, could be found in “number so great as to be uncountable,” as one contemporary wrote. Passenger pigeons flew in flocks of billions, darkening the sky for days at a time as they passed overhead. Eskimo curlews, puffins, teals, plovers and more could be found in numbers genuinely unthinkable today.

And that’s just to speak of the East Coast, and just to speak of birds. Writing from the Pacific Northwest in the 17th century, Nicolas Denys noted that “so large a quantity of salmon enter[ed] the river [that] at night one [was] unable to sleep.” Elsewhere cod were “so thick by the shore that [one] could hardly have been able to row a boat through them.”

In 1620, the crew of the Mayflower noted “every day saw whales plying hard by us; of which, in that place, if we had instruments and means to take them we might have made a very rich return.” Tens of millions of buffalo dwelt on all corners of the continent, as did wolves and great cats.8

Such mind-blowing abundance was the accumulated gift of hundreds, if not thousands, of generations, each one living to give back more than they took. It’s true they had an impact on their ecology, but impacts can be positive or negative. To justify the clearcut moonscapes Weyerhaeuser leaves behind by appeal to the horticultural techniques of Native groups that cultivated such abundance is nothing short of obscene.

Science reveals a rather different picture of humanity in its natural state. In a 1996 study University of Michigan ecologist Bobbi S. Low analyzed 186 pre-industrial societies and discovered that their relatively low environmental impact is the result of low population density, inefficient technology and lack of profitable markets, not conscious efforts at conservation.9

Of course, that’s exactly the point. The “Noble Savage” is not ennobled by any kind of innate moral superiority, but by a way of life that fundamentally works. Ethical injunctions always fail, because they pit a bad conscience against self-interest. Working societies channel self-interest in directions that are sustainable. Primitive societies don’t give back more than they take because the people are so much kinder and better than we are; they do so because that’s what their societies are structured to accomplish, as systems. What such critics might deride as “low population density, inefficient technology and lack of profitable markets” might more accurately be termed human-scale societies, simple, elegant technologies, and an emphasis on people over products in a gift economy. Of course this wasn’t the result of “conscious efforts at conservation.” Such efforts invariably fail. Their success lay precisely in the fact that it was not conscious, and that these weren’t efforts at conservation, but efforts to build wealth and prosperity for themselves, their children, their communities, and the non-human communities they lived with.

The Gentle People

Conventional history has long shown that, in many ways, we have been getting kinder and gentler. Cruelty as entertainment, human sacrifice to indulge superstition, slavery as a labor-saving device, conquest as the mission statement of government, genocide as a means of acquiring real estate, torture and mutilation as routine punishment, the death penalty for misdemeanors and differences of opinion, assassination as the mechanism of political succession, rape as the spoils of war, pogroms as outlets for frustration, homicide as the major form of conflict resolution—all were unexceptionable features of life for most of human history. But, today, they are rare to nonexistent in the West, far less common elsewhere than they used to be, concealed when they do occur, and widely condemned when they are brought to light.

At one time, these facts were widely appreciated. They were the source of notions like progress, civilization, and man’s rise from savagery and barbarism. Recently, however, those ideas have come to sound corny, even dangerous. They seem to demonize people in other times and places, license colonial conquest and other foreign adventures, and conceal the crimes of our own societies. The doctrine of the noble savage—the idea that humans are peaceable by nature and corrupted by modern institutions—pops up frequently in the writing of public intellectuals like José Ortega y Gasset (”War is not an instinct but an invention”), Stephen Jay Gould (”Homo sapiens is not an evil or destructive species”), and Ashley Montagu (”Biological studies lend support to the ethic of universal brotherhood”). But, now that social scientists have started to count bodies in different historical periods, they have discovered that the romantic theory gets it backward: Far from causing us to become more violent, something in modernity and its cultural institutions has made us nobler.10

Of course, to support his contention, Pinker points to sixteenth-century Paris, Biblical Israel, and medieval Europe, some of the most civilized examples of savages ever offered up. He does eventually address “savages” in his critique of the Noble Savage, in a single paragraph easy to lose amid his dismissal of anyone who might doubt civilized beneficence:

Contra leftist anthropologists who celebrate the noble savage, quantitative body-counts—such as the proportion of prehistoric skeletons with axemarks and embedded arrowheads or the proportion of men in a contemporary foraging tribe who die at the hands of other men—suggest that pre-state societies were far more violent than our own. It is true that raids and battles killed a tiny percentage of the numbers that die in modern warfare. But, in tribal violence, the clashes are more frequent, the percentage of men in the population who fight is greater, and the rates of death per battle are higher. According to anthropologists like Lawrence Keeley, Stephen LeBlanc, Phillip Walker, and Bruce Knauft, these factors combine to yield population-wide rates of death in tribal warfare that dwarf those of modern times. If the wars of the twentieth century had killed the same proportion of the population that die in the wars of a typical tribal society, there would have been two billion deaths, not 100 million.11
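The proportion-scaling behind the quoted closing figures can be checked with a few lines of arithmetic. The 100 million and 2 billion totals come from the passage above; the population figure below is a hypothetical round number used only for illustration:

```python
# Per-capita scaling behind the "2 billion vs. 100 million" comparison.
# The two death totals are the figures cited in the quoted passage; the
# population is a HYPOTHETICAL round number for illustration only.
modern_war_deaths = 100e6   # cited 20th-century war deaths
tribal_scaled_deaths = 2e9  # cited deaths if tribal proportions had applied

# Implied ratio of tribal to modern per-capita war mortality:
ratio = tribal_scaled_deaths / modern_war_deaths
print(ratio)  # 20.0

# With an assumed 5 billion people exposed to the century's wars:
population = 5e9
print(modern_war_deaths / population)     # 0.02 -> 2% die in war
print(tribal_scaled_deaths / population)  # 0.4  -> 40% under tribal rates
```

The exact percentages shift with whatever population one assumes, but the 20-to-1 ratio between the two regimes is fixed by the quoted totals themselves.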

Lawrence Keeley’s War Before Civilization: The Myth of the Peaceful Savage really represents the first work in this trend, and continues to provide the foundation for the later work of LeBlanc, Walker and Knauft. Much of Keeley’s evidence centers on archaeological evidence of fortified villages around the world. Of course, that merely proves that food producers engaged in warfare; Keeley’s further-reaching assertion that warfare is endemic to human nature goes far beyond the evidence he provides of horticultural warfare, and his work has been heavily criticized on this point. Though we have cave paintings going back into the Upper Paleolithic, 40,000 years ago, it is only about 10,000 years ago, with the invention of the bow, that we see the first cave paintings of groups fighting. We do not see paintings of people fighting with clubs or even atlatls; we see them fighting first with bows and arrows. Some paintings even portray still-recognizable tactical techniques like flanking maneuvers and envelopments. It is also at this time that the first skeletal evidence of warfare emerges: bones showing evidence of violent death, arrowheads embedded in skeletal remains, and so forth. This is quite late in our history as a species, and once again correlates with the rise of food production.

Keeley tries to expand his case by looking at evidence of violence among modern hunter-gatherers, most notably the Plains Indians. “Keeley finds brutish behavior everywhere and at all times, including among the American Indian. If the number of casualties produced by wars among the Plains Indians was proportional to the population of European nations during the World Wars, then the casualty rates would have been more like 2 billion rather than the tens of millions that obtained.”12 Anthropologists have learned that modern hunter-gatherers are not “living fossils” that necessarily preserve pre-civilized behaviors; such comparisons instead illustrate the basic economics of hunter-gatherer life and the pressures it places on sharing and social bonds. Yet Keeley presents modern hunter-gatherers as precisely such fossils when he needs evidence of prehistoric warfare. The Plains Indians are a particularly ironic choice, given the evidence Peter Farb gathers in Man’s Rise to Civilization, As Shown by the Indians of North America from Primeval Times to the Coming of the Industrial State. There, Farb shows that the Plains Indians we know did not exist prior to European contact. They descended from refugees of other Native groups destroyed by the European epidemics that wiped out 90% or more of North America’s population in the years after 1492, with a new culture assembled around two important European introductions: the horse, re-introduced as wild herds proliferated across the Americas, and guns traded from French fur trappers. The Plains Indians had a post-apocalyptic culture.

Given the trauma of what was essentially the end of the world for Native groups, a surge in violence would be expected. 90% or more of the American population died from epidemic disease. Groups were displaced, and a massive rearrangement of tribal territories caromed across the continent like billiard balls long in advance of European settlers. When “the Pilgrims” came to Plymouth, they were aided in setting up their new colony by a Patuxet native named Tisquantum (better known to Europeans as “Squanto”). He was the primary contact for the Europeans on behalf of the Wampanoag Confederacy because he already knew English: captured by George Weymouth in 1605, he worked for nine years in London, returned to the New World with John Smith (of “Pocahontas” fame) in 1613 only to be dropped off on the wrong part of the continent, was kidnapped by Thomas Hunt, and escaped back to London. He finally made it home in 1619, to discover that his village had been wiped out, probably by smallpox. When “the Pilgrims” arrived, they set up their colony on the ruins of Tisquantum’s home town. Tisquantum’s twisted tale illustrates the enormity of the European impact on Native lives long before what we would normally consider the first point of European contact. By the time Europeans came, most places they arrived were already shattered. The archaeological record bears out a significant increase in violence in this post-apocalyptic era.

Researchers examined thousands of Native American skeletons and found that those from after Christopher Columbus landed in the New World showed a rate of traumatic injuries more than 50 percent higher than those from before the Europeans arrived.

“Traumatic injuries do increase really significantly,” said Philip L. Walker, an anthropology professor at the University of California at Santa Barbara, who conducted the study with Richard H. Steckel of Ohio State University.

The findings suggest “Native Americans were involved in more violence after the Europeans arrived than before,” Walker said. But he emphasized there was also widespread violence before the Europeans came.

Nevertheless, he said, “probably we’re just seeing the tip of the iceberg” as far as the difference between violence levels before and after. That’s because as many as half of bullet wounds miss the skeleton. Thus, the study couldn’t detect much firearm violence, though some tribes wiped each other out using European-supplied guns.13

Of course, such an increase does indicate that violence occurred prior to any European contact at all. That’s certainly true of the New World states in Mesoamerica and the Andes, but some level of violence should be expected at all times and places. Violence has a place in the natural world. Part of our civilization’s twisted view of the world is its inability to come to terms with violence in a healthy way. There is an inherent violence to animal life—a fact that philosophers have often commented upon. The chopstick has its origins in Confucius’ observation that stabbing food with a knife is a violent act, and his desire to move away from that. The Buddhist desire to do no harm finds its apex in some monks who wear shoes like short stilts, to minimize the possibility of stepping on a bug. Ethical vegetarians often eschew the eating of meat because they do not want to kill anything, so instead, they kill vegetables. Whenever and wherever we try to eliminate the harm we do, we run into the essentially violent nature of animal life: we live because others die.

Shamanism is a hunter’s religion. The forager life does not afford the luxury of such self-deceit: it requires the predator to make his peace with his animal nature, including the inherent, inescapable fact of violence. No animal lives, except by the deaths of others. The mediation of life and death in a universe carefully balanced between them is at the heart of everything the shaman does. At the heart of agricultural philosophy, however, is a desire to escape this system: to have life, and never death; growth, and never decay; health, and never sickness. In the end, it is a fool’s dream doomed to failure, but the longer it goes on, the more death, decay, and sickness is needed to balance out the folly. Normally, the forager’s life balances these in small, manageable portions. Civilization distinctly amplifies violence in its unhealthy flight from it.

Because of this, civilization cannot tolerate ambiguity. We demonize our Tricksters, rather than countenance their ambivalence. The world must be divided between black and white, good and evil—with good defined in ever more narrow terms, and no deviation from it tolerated.

The foundation of civilization, the cornerstone of the state, is an idea first formalized by Max Weber in a 1918 speech, Politik als Beruf (Politics as a Vocation), when he said:

“Every state is founded on force,” said Trotsky at Brest-Litovsk. That is indeed right. If no social institutions existed which knew the use of violence, then the concept of “state” would be eliminated, and a condition would emerge that could be designated as “anarchy,” in the specific sense of this word. Of course, force is certainly not the normal or the only means of the state—nobody says that—but force is a means specific to the state. Today the relation between the state and violence is an especially intimate one. In the past, the most varied institutions—beginning with the sib—have known the use of physical force as quite normal. Today, however, we have to say that a state is a human community that (successfully) claims the monopoly of the legitimate use of physical force within a given territory. Note that “territory” is one of the characteristics of the state. Specifically, at the present time, the right to use physical force is ascribed to other institutions or to individuals only to the extent to which the state permits it. The state is considered the sole source of the “right” to use violence. Hence, “politics” for us means striving to share power or striving to influence the distribution of power, either among states or among groups within a state.

Fundamental to the existence of the state is the myth that peace requires the members of a society to relinquish their natural born right as animals to use violence, and instead invest that right with the state. The idea is that by allowing the state this monopoly of force, it may control and meter the use of force, resulting in less violence overall.

If I kill someone, it is “murder”—a killing that is considered unjustified. If a policeman or a soldier kills someone, it is usually justified. The criteria for justified and unjustified violence are left for the state to decide, and part of what makes it a state is that the only legitimate uses of violence are by, and for, the state, as with the military or a para-military group like the police. Under pressure after especially egregious abuses of this power, the state may reclassify previously justified uses of force as unjustified, and may institute various review procedures for police, soldiers, or other state-approved killers, but this is ultimately an elaboration on the state’s prerogative to determine arbitrarily what constitutes justified and unjustified violence.

We have been trained since birth to think of violence as “wrong,” because it is essential for the state’s continued existence that we accept the myth of its legitimacy, and the founding premise of that so-called legitimacy, the monopoly of force. Only by turning violence into an “evil” thing, the same way we demonize the Trickster, can we willingly shed ourselves of such a natural function of animal life. Only then can we believe in the monopoly of force; only then are we willing to accept its dominance over us; and only then does the mythology of “legitimacy,” whereby our natural-born freedom is relegated to “rights” for the state to arbitrarily grant or deny, seem not only reasonable but the way things should be. Without first accepting that violence is “wrong,” none of the systems of control and domination follow. Yet we can hardly find any reason why it should be so. Even if we do accept the perverse notion that violence is “wrong,” rather than simply an ambivalent and ever-present feature of animal life, is there any evidence that the state has succeeded in reducing violence through its monopoly of force?

Keeley cites evidence that the wars of the Plains Indians were, per capita, as costly as the two world wars. He further cites evidence that the !Kung homicide rate rivals that of the inner city. Knauft14 points out that such violence comes from the changing nature of Bushmen life as their old way of life is shattered and they are settled into a new, sedentary lifestyle. Kent15 further points out that Keeley’s statistic is taken out of context. In addition to these criticisms, I would draw attention to the double standard applied to violence in our own civilization, versus how it is applied to “primitive” societies.

Keeley makes the argument that, because horticultural and forager peoples have such low population density, a single homicide could be a war. Of course, Keeley ignores the systematic violence of civilization: he lumps together all occasions of violence in “primitive” societies, and compares them to our own wars, or our own homicide rates, but never the combination. Neither does he take into consideration the toll of our violence professionals: the violence done by police, or incarceration (1 in 37 adults in the United States), or the essential violence of the market and tax system we’re forced into by civilization’s violence professionals. In civilization’s pursuit of a peaceful society, the “Monopoly of Force” has succeeded only in making every aspect of society violent. When we consider fully that the violence Keeley cites is that of an essentially post-apocalyptic society, and that he is comparing it to only a small portion of our own violence, it becomes clear that contrary to Keeley’s case, civilization has vastly increased violence. It has made violence ubiquitous, and it has changed the fundamental nature of it.

Foragers were not perfectly peaceful, but they also limited violence and aggression, just like all other animals. Tennyson may have described nature as “red in tooth and claw,” and that view certainly permeated Darwin’s understanding of nature when he developed his theory of evolution, but we have since learned that natural selection typically operates on a far less lethal basis. A slightly higher birth rate is all that is required for one variety to out-compete another in the long term; examples of outright starvation and violence are exaggerations used to make a point. When we observe violence in the animal kingdom, we see that animals try to avoid it wherever they can. After all, on average, any animal has only a 50% chance of surviving a fight to the death. Lethal force is not an evolutionarily stable strategy; hyper-aggressive animals destroy themselves on a fairly short timeline. Successful animals have learned to limit aggression and violence.

Probably the most spectacular examples of ritualization involve ritualized aggression; as E.O. Wilson formulated the key question: “Why do animals prefer pacifism and bluff to escalated fighting?” The answer is that agonistic behavior is potentially advantageous but also carries very high costs (in terms of the individual’s inclusive fitness): possible injury and even death, ‘aggressive neglect’, etc. (see especially Cronin and Van der Dennen for detailed analysis). For each species, Wilson observes, there exists some optimal level of aggressiveness above which individual fitness is lowered.16

Foragers, too, have succeeded in the long term, and like other animals, they have done so only because they have succeeded in limiting violence. Forager groups constantly tested their boundaries with one another in skirmishes. In extremely marginal ecologies, like the Arctic, foragers can be extremely violent, with long-lasting blood feuds. But, as Raymond Kelly discusses in Warless Societies & the Origins of War, marginal societies tend not to have the resources to go to war, and affluent societies (like most foragers) have no reason to go to war. So, like agriculture (see thesis #10), warfare develops not from strict scarcity or abundance, but from the nerve-wracking, extreme fluctuations between them, with periods of scarcity priming a population’s fears, and the abundance that follows providing the resources to act on those fears.

Primitive violence was primarily psychological in nature, what we might today call “terrorism.” Even today, terrorist attacks produce psychological effects that far outweigh their impact in terms of casualties or damage. The worst terrorist attacks in recent history, on 11 September 2001, killed 3,000 people, but the psychological effect led to even greater changes, including the formation of the Department of Homeland Security, the passing of the PATRIOT Act, and wars in Iraq and Afghanistan. In the initial “shock-and-awe” bombing of Baghdad alone, some 9,000 civilians were killed, three times as many as died on 9/11. Obviously, the psychological effect was far greater.

Primitive violence relied on these kinds of tactics: they minimized actual loss of life, while maintaining the peace by projecting an image of terror. Neighbors would not attack one another for fear of defeat. Scalping, once thought to have been a European innovation, has been noted archaeologically in North America—as has considerable evidence of survivors of scalping incidents. More important than destroying an enemy was reminding the enemy why they didn’t want to attack you.

In fact, among certain tribes, there appears to have been great honor associated with the infiltration of an enemy village to make a kill. Catlin reports that chief Four Bears boasted of sneaking to a point within view of an enemy village, spying on the village for several days, then killing two women in full sight of the other members of the enemy village. Swanton goes a step further and states that among the Creek, the taking of a woman’s or child’s scalp was considered a sign of even greater valor than the taking of a scalp on the battlefield, because it indicated that the warrior had penetrated all the way into the enemy’s village, a feat which required great skill.17

Other elements of primitive warfare continued to emphasize psychological warfare over total warfare; the practice of “counting coups” on an enemy placed greater honor on rushing up to an enemy and touching him, than actually killing him. Such a display proved the superior bravery and skill of a warrior even more than a kill (which guns made all too easy a matter). Such displays were meant to intimidate neighboring people, who would then be too afraid to attack.

In many prior accounts it was stated that [Plains] Indians’ most prized achievement in battle was the scalping of a fallen enemy. Grinnell maintains that the greatest honour among tribes in battle was not to scalp the enemy but to touch him during battle. The scalping is instead a trophy of the victory, which is later used in ceremonial dances. The touching of one’s enemy, whether (dead or alive) by hand or with something held in the hand is called counting coup. The first man to count coup was considered the most honourable and would receive praises when returning from battle. The second man to touch the enemy was in turn considered the second most honourable and so on. It is not clear how many man are allowed to count coup on one enemy.18

When and where primitive warfare led to actual battles, these were primarily ritualized contests. Such ritual confrontations are frequent among non-human animals as well. Before committing to a fight, it is best to “size up” an opponent. Such rituals essentially put each combatant’s prowess on display. In most cases, there will be a severe mismatch, so rather than risk certain defeat, the weaker animal simply cedes to the stronger, avoiding injury and possibly even death. For his part, the stronger animal now has the advantage of victory without whatever injury might have been suffered in actual combat.19 The same attitude was taken by Japanese samurai in comparing stances, sometimes revealing differences obvious enough to both parties that the duel became unnecessary. Ritualized primitive battles should be understood in the same way.

Of course, sometimes this last effort to avoid confrontation fails; when that happens, such rituals also help to prepare for the battle to come.

Ritual reduces anxiety and fear and institutes confidence, while at the same time giving assurance of ultimate meaning. It reinforces the solidarity of the group by dramatizing its status structure. It strengthens group boundaries, justifies its hostile or defensive activities, and expiates its guilt. It especially supports the warrior values and the warfare process by ceremonially transforming the guilt of killing into self-righteous virtue and strength.20

As Keeley and subsequent critics note, of course, primitive warfare is less a question of pitched battles where, admittedly, few die, and more a question of raids and homicides. These, too, primitive cultures limit, employing cultural means like the Inuit song duel to provide a peaceful resolution to what otherwise might continue as an escalating blood feud.

Song-duels seem to have been especially popular among Eastern Arctic peoples, who used them as a way to resolve interpersonal conflict in a non-violent (at least, non-physically violent) fashion. The tradition was encouraged, not only because of its non-violent nature, but because it was great entertainment for the listeners.

The idea was very simple: each contestant would have a turn at inventing a song (sort of the Inuit equivalent of an evening at the improv) with lyrics that would humble, belittle, satirize, denigrate, revile, and generally humiliate the opponent.21

Alongside arguments like Keeley’s, one will often find discussions of torture and cannibalism in primitive societies. The 1917 Catholic Encyclopedia writes of the Iroquois, “Even among savages they were noted for their cruelty, cannibal feasts and sickening torture of captives being the sequel of every successful war expedition, while time after time the fullest measure of their awful savagery was visited upon the devoted missionary.”22 In France and England in North America, Francis Parkman describes Iroquois cannibalism:

They bound the prisoners hand and foot, rekindled the fire, slung the kettles, cut the bodies of the slain to pieces, and boiled and devoured them before the eyes of the wretched survivors. “In a word,” says the narrator [that is, the Algonquin woman who escaped to tell the tale], “they ate men with as much appetite and more pleasure than hunters eat a boar or a stag …”

As with so much else concerning primitive violence, this depiction is not so much wrong as vastly oversimplified and sensationally lurid.

The Iroquois believed that the death of an individual created a spiritual void which undermined the power of the entire community. Paralleling this community void was a personal void experienced by the friends and family of the deceased. If the village condolence rituals failed to assuage the grief of these individuals, the female relatives of the deceased had the right to call for a mourning war. When a mourning war was called, the village assembled a war party which was sent out in search of captives for adoption into the community. After one or more captives were taken from neighboring nations, the prospective adoptees were brought back to the village and beaten as they passed through a gauntlet composed of two rows of men, women, and children. The captives were then tortured, but not killed, by all members of the community. At this point, the torture was suspended and all the potential adoptees participated in a grand feast. At the end of the feast, the women of the family of the deceased determined the fate of the captives. If a captive was rejected, he or she was further tortured, killed, and often eaten; if a captive was accepted, he or she was adopted into the family. The newly adopted member often took the name of the deceased individual whose death had initially triggered the mourning war. In most cases, male captives were not adopted because they represented both a flight risk and potential threat. Women and children who were adopted found themselves, according to most descriptions, genuinely welcomed into the community. In many cases, the children of female captives became village leaders, Pine Tree (war party) chiefs, and even chiefs of the Iroquois League.23

Such conflicts could easily spiral out of control; mourning wars led to the creation of the Iroquois League, the Huron Confederacy and the Neutral Confederacy. The decimation of the Native population from the cascading impacts of European contact led to a new round of escalating violence. The torture and cannibalism of captives was not as simple as providing meat or venting hatred, however.

But in the European sense, torture was a means to extract information or confessions from prisoners; in the Iroquois culture, torture seemed to be a way to strip the prisoners of their former lives so that they may be integrated into Iroquois society or killed for their spiritual power. Indeed, the entire ceremony surrounding the capturing, pre-torture festivities, the torture, and the post-torture all have deep cultural reasoning; the replacement of the dead, or, at least, the easing of the mourners’ grief. …

Historical stereotypes of the Iroquois depict them as a war-loving peoples. This is not entirely the case. Before the heavy influence of and Iroquois dependence on European trade, warfare was almost exclusively for the purpose of ‘replacing’ the recently deceased with captured enemies. In the Iroquois war, taking as many captives as possible was essential. Killing off an enemy or even suffering a few casualties “would subvert the purpose of warfare as a means of replacing the dead with captives.” As mentioned earlier, torture procedures began long before the actual torture. Captives were usually of non-Iroquois nations and villages, and usually those who were on bad relations with the Iroquois because of the lack of trade or political disagreements. After initial capture, the captives were abused during the journey back to the Iroquois village; there are conflicting reports as to the exact treatment of the captives, but both reports state that the hands, legs, feet, and especially fingers of the prisoners were greatly abused and mutilated. The captives were frequently forced to sing, as they were showered with gifts of food and beads and treated as guests of honor or esteemed clansmen and kindred. While being able to perceive the reasoning behind the capture of the prisoner, the French missionaries were not able to see the deeper meaning of the Iroquois kindness. The Iroquois religious, political, and social structures were all based on gift giving. To gain power and influence over a person or a group of people, one would have to give numerous material gifts away, usually beads, furs, or food. Therefore, by giving the captive material gifts, the Iroquois were demonstrating to him that they had absolute power over him; the power of life or death.
“From the victim’s perspective, torture was at once a reminder of his captors’ dominance, a test of perseverance, (reflected in songs and in taunts hurled at the torturers), and an altered state of consciousness similar to a vision quest or a dream that provided contact with sources of spiritual power and paved the way for a new life.” …

The captives were ‘given’ to a family in mourning over the death of a person of kin, and this family chose whether to adopt or kill the captive. Relations contains stories of both end results. But either way, the captive had to endure hours of torture. Those Iroquois that conducted the torture was seemingly the entire town, but more specifically, it was the young men and the grieving women that tortured the captive. … The Iroquois continued to burn the captive’s feet, legs, and occasionally other body parts, but not to the degree as they had previously witnessed. During this time, they were calling the captive “uncle” in what the missionaries presumed to be mocking fashion. While it’s possible and probable they were mocking him in some fashion, the “uncle” was probably not part of this. In Iroquois longhouses, the father usually did not live with the children, and so the Uncle of the children filled the role of the father that Europeans were accustomed to; “Uncle” was a sign of respect, and if the captive was to be integrated into the Iroquois society, as was the intention and purpose of the torturing, he would probably be of an esteemed social rank because in this case, he was given to a chief or headsman. …

After the torturing of the captive was finished, by his premature death or otherwise, the family he was given to got to ultimately decide his fate; to kill or keep. If the captives were killed, they were cooked and eaten by the Iroquois village, thus they absorbed the captive’s spiritual power. If the captive were to be kept alive, his wounds were treated, and he became a member of Iroquois society as a member of the recently deceased; “I became aware that I was given in return for a dead man… causing the dead to become alive again in my person, according to their custom.” In the European sense, these adoptees were slaves, but in the Iroquois sense, they were members of their society with little to no political or social power; an infant. These adoptees could live dormantly as ‘lower class’ citizens or they could, using adopted Iroquois culture and techniques, rise through the political, social, or spiritual ranks to wield great power and influence over other Iroquois.

Through torture, Iroquois reflected their complex culture to their victims. The need or desire to keep the captives alive resulted in heroism for the Iroquois warriors. The gift giving ceremonies reflected the social and political system of the Iroquois. The decisions of the grieving women to keep the captives alive or to kill them off reflected women’s power within the society. The torturers’ taunts perversely reflected social life in the longhouses. If the captive died his spiritual power was absorbed, reflecting the Iroquois religion in a simplistic manner. And if the captive was adopted into an Iroquois family, he would be learning the culture from the ground up with the basis of what he learned through the torture; eventually revealing to him why he was taken captive in the first place: To replace a member of the recently deceased.24

Cannibalism also fulfilled the same goal: once the flesh of an enemy was consumed, he ceased to be an enemy, and was instead absorbed by the people of the village, making him a relative. Such practices can hardly be dismissed by invoking their ritual and cultural significance: they are still clearly violent acts. However, we can also see that our normal concepts of “slavery,” “cannibalism” and “torture” do not entirely fit such practices. Cannibalism appears somewhat frequently in human cultures, but it is never as simple as a matter of nutrition. Cannibalism always has significant cultural and religious taboos that limit and restrict the practice. In the case of the Haudenosaunee, such practices were used to compensate for the deep trauma of losing a close relative. The greatest achievements of the Haudenosaunee—like the Great Law of Peace—were dedicated to limiting such violence.

No society can ever be fully devoid of violence, but those that aspire to such a goal only become more violent by denying its place in the world. Primitive societies did engage in violence, and without a permanent class of professional killers, it fell to primitive peoples themselves to execute what violence became necessary. Perhaps that is in part why such societies also did so much to limit violence. Contemporary charges against primitive warriors rely on observations of “post-apocalyptic” societies decimated by European contact, ignoring the evidence that the overwhelming impact of that contact significantly increased their violence. What we do see, however, is ample evidence of means to limit violence—emphasis placed on bravery and intimidation to keep violence from breaking out, ritual approaches aimed at reconciling enemies, and alternative forms of contesting differences, such as song duels or counting coup. To properly compare the effectiveness of such approaches to our own, we need to take an honest accounting of violence in our own society—wars, murder, violent crime, incarceration, police brutality, and the full impact of our professional violence class. We need to look also to the ubiquitous violence inherent in our social system: the threat of violence that lies behind paying your rent, obtaining your food, and every other aspect of civilized existence. Primitive societies were not devoid of violence, but they did limit it, and it was a much rarer thing. Among them, violence was something that happened. For us, it’s a way of life.

The Honest Injun

In Othello, the honesty and forthrightness of the “noble Moor” contrasts sharply with Iago, an arch-deceiver whose motives critics have continually tried to puzzle out, often coming to the conclusion that deception is simply Iago’s diabolical nature. The play thus sets up another key element of the “Noble Savage”: unflinching honesty, even an inability to lie or deceive. As Washington Irving writes in Captain Bonneville’s Narrative:

They were friendly in their dispositions, honest to the most scrupulous degree in their intercourse with the white man. (P. 200.) Simply to call these people religious would convey but a faint idea of the deep hue of piety and devotion which pervades their whole conduct. Their honesty is immaculate, and their purity of purpose and their observance of the rites of their religion are most uniform and remarkable. They are certainly more like a nation of saints than a horde of savages.

In On Cannibals (1580), Michel de Montaigne said of Brazilian tribes, “The very words that signify lying, treachery, falsehood, avarice, envy, detraction and pardon were unheard of among them.”

Such an assessment is, in one sense, hopelessly naïve, and simultaneously condescending; it suggests a child-like innocence that robs Native peoples of their autonomy by suggesting that they lack some of our most basic cognitive abilities, even if they are abilities we often put to ill use.

The Trickster tradition in primitive societies puts the lie to this notion. Even in horticultural villages, hunting is an essential activity, and hunter-gatherers are superb trackers, trappers and hunters. All of these require cunning, stealth, and yes, trickery. Most primitive societies offer significant honor to the Trickster, generally associated with clever predators or scavengers like Coyote or Raven. They tend also to place great honor on a good trick. Such things would not be possible without the capacity to lie.

However, primitive societies do tend to distinguish themselves from civilized peoples in the emphasis they place on honesty. Dr. Charles A. Eastman, born Ohiyesa (Santee Sioux), wrote The Soul of the Indian: An Interpretation in 1911. In chapter 4 (“Barbarism and the Moral Code”), he wrote, “It is said that, in the very early days, lying was a capital offense among us. Believing that the deliberate liar is capable of committing any crime behind the screen of cowardly untruth and double-dealing, the destroyer of mutual confidence was summarily put to death, that the evil might go no further.”

In the preceding section on violence, the “post-apocalyptic” nature of extant primitive societies was noted. This has enormous ramifications, including the collapse of primitive psychological and social life, what E. Richard Sorenson called “preconquest consciousness.” He notes that this presents a significant epistemological problem for anthropology.

Most anthropologists are aware that what comprise the standard habits, inclinations, and activities of humankind in one culture may seem quite exotic in another. When the separateness of peoples is extreme, incompatible modes of awareness and cognition sometimes arise, as occurred between the preconquest and postconquest eras of the world. Basic sensibilities, including sense-of-identity and sense-of-truth, were so contradistinctive in these two eras that they were irreconcilable. Even core features of life in one era were imperceptible to people in the other. While such disparate cognitive separation may be rare, a single occurrence is sufficient to make anthropology an epistemological problem.25

How can a Western anthropologist ever find an accurate reflection of primitive life, when the effects of Western contact precede anthropologists in the form of epidemics and cascading displacement, destroying societies before they can ever be encountered? Archaeology offers a limited window, but far more instructive are the rare moments when an anthropologist can find some enclave that has largely escaped such effects. That is precisely what Sorenson found.

Preconquest mentality emerged from a sociosensual infant nurture common to its era but shunned in ours. … In the isolated hamlets in the southern forests, infants were kept in continuous bodily contact with mothers or the mothers’ friends—on laps when they were seated, on hips, under arms, against backs, or on shoulders when they were standing. Even during intensive food preparation, or when heavy loads were being moved, babies were not put down. They had priority. …

Not put aside when work was being done, infants remained constantly in touch with the activities of life around them, their tiny hands ever reaching out to whatever items or materials were in use, and onto the hands, arms, and muscles of the users. In this way even as tiny babes-in-arms they began accumulating a kinesthetic familiarity with the implements and activities of life.

This familiarity, supplemented by a rapidly developing ‘tactile-talk,’ produced in toddlers an ability to manage objects and materials safely that might be dangerous elsewhere. When first sojourning in those southern hamlets, I was repeatedly aghast to see toddlers barely able to stand upright playing with fire, wielding knives, and hefting axes—without concern by anyone around. Yet they did not burn down their grass/bamboo abodes or chop off their toes and fingers. During all the years I spent within their communities, I never saw these babies hurt themselves while engaged in this type of independent exploration.

When tots explored outward, their antennae stayed tuned to the affect, mood, and musculature of those they left behind, thereby maintaining affective connection across space. With adults and older children constantly a source of gratification rather than obstruction, toddlers had no desire to escape from supervision. Even slight intimations of concern from those behind, such as a tensing of musculature, was enough to stop a baby in its tracks and cast about for cues. While mothers in many places feel within themselves the kind of pain that might be looming for their baby, it’s not so instantly perceived. Faster than any words of warning could be formed, these New Guinea tots were already responding. No words necessary. If some subtle ‘all-clear’ cue did not quickly come, the infant made fast tracks to ‘home base’: No reckless plunges onward, no furtive tricks to escape supervision.26

Sorenson made these observations among the South Fore of New Guinea, a region also noted for the incidence of kuru due to cannibalism. Jean Liedloff made similar observations about the Yequana in Venezuela in The Continuum Concept. There, Liedloff notes the importance of nearly constant physical contact in infants. The manner in which primitive children are raised includes points that make many current “experts” scream, such as cosleeping with parents, or breastfeeding fully to the age of five in some cases. In the book, Liedloff notes:

It is no secret that the “experts” have not discovered how to live satisfactorily, but the more they fail, the more they attempt to bring the problems under the sole influence of reason and disallow what reason cannot understand or control.

We are now fairly brought to heel by the intellect; our inherent sense of what is good for us has been undermined to the point where we are barely aware of its working and cannot tell an original impulse from a distorted one.

On her website, Liedloff contrasts the experience of primitive children with the earliest, most formative experiences of children in civilization:

  • traumatic separation from his mother at birth due to medical intervention and placement in maternity wards, in physical isolation except for the sound of other crying newborns, with the majority of male babies further traumatized by medically unnecessary circumcision surgery;*
  • at home, sleeping alone and isolated, often after “crying himself to sleep”;
  • scheduled feeding, with his natural nursing impulses often ignored or “pacified”;
  • being excluded and separated from normal adult activities, relegated for hours on end to a nursery, crib or playpen where he is inadequately stimulated by toys and other inanimate objects;
  • caregivers often ignoring, discouraging, belittling or even punishing him when he cries or otherwise signals his needs; or else responding with excessive concern and anxiety, making him the center of attention;
  • sensing (and conforming to) his caregivers’ expectations that he is incapable of self-preservation, is innately antisocial, and cannot learn correct behavior without strict controls, threats and a variety of manipulative “parenting techniques” that undermine his exquisitely evolved learning process.

* Note, “medically unnecessary” yet traumatizing practices are frequent in primitive cultures as well, though rarely so quickly after birth. For example, certain Australian Aboriginal peoples revere the emu, and so boys initiated into manhood have the underside of their penises sliced down to the urethra; it is then allowed to heal, to produce a ridge of scar tissue that makes it more resemble an emu’s penis.

Under such conditions, the distance between the psychological condition of modern Westerners and that found in “preconquest consciousness” becomes easier to understand. Fulfilling a child’s basic need for physical touch is the first initiation into reading subtle but expressive body language. We normally miss microexpressions27, the tensing of muscles, and other “subtle” cues that should be plainly evident, for the simple fact that most of us were not brought up in such a way. It is like being deaf in a whole society of the deaf: as in Plato’s Allegory of the Cave, we cannot appreciate what we are missing because we have never glimpsed it. Instead, we learn speech as a disembodied medium; we communicate with one another in words, but lack any but the most obvious sensuous cues that underlie them. Social anthropologist Edward Hall estimated that 60% of our communication is non-verbal,28 even with this impediment. In primitive cultures, Sorenson and Liedloff reveal, it is significantly higher.

When babies began acquiring verbal speech, their words and sentences floated out atop a sophisticated body-language already well in place. Even after acquiring spoken language, tactile-talk continued taking precedence in much of daily life. It conveyed affect better. It was faster and more direct. Most of all it touched more deeply and more quickly into the hearts and minds of others. Tactile-talk was affect-talk. It integrated the spontaneous affect of individuals, often many at a time. So adept did young children become at this that they would at times merge actions into wordless synchrony. With such rapport surrounding them tots could also safely enter into the rough-and-tumble play of older children. There were no games with rules, no formal skills to measure up to. Play was spontaneous, improvised, and exploratory, so small children were never in the way. Instead, their wide-eyed enthusiasm was a constant source of pleasure for the older children. The younger children were always welcome and were handled with intuitive regard and delight.

If some aggravation unwittingly occurred in the course of active play, it withered quickly within the collective empathy. Negative feelings thus faded before they had a chance to grow. Full-blown expressions of, for example, anger or sadness, were therefore very rare. That, too, contributed to the intuitive rapport that so delighted them.29

Group sleeping, expressive gestures, and the constant touch observed in primitive families serve to coordinate and synchronize community members with one another. The end result is an incredible efficiency and wordless communication, providing very material advantages.

One day, deep within the forest, Agaso, then about 13 years of age, found himself with a rare good shot at a cuscus in a nearby tree. But he only had inferior arrows. Without the slightest comment or solicitation, the straightest, sharpest arrow of the group moved so swiftly and so stealthily straight into his hand, I could not see from whence it came.

At that same moment, Karako, seeing that the shot would be improved by pulling on a twig to gently move an obstructing branch, was without a word already doing so, in perfect synchrony with Agaso’s drawing of the bow, i.e., just fast enough to fully clear Agaso’s aim by millimeters at the moment his bow was fully drawn, just slow enough not to spook the cuscus. Agaso, knowing this would be the case, made no effort to lean to the side for an unobstructed shot, or to even slightly shift his stance. Usumu, similarly synchronized into the action stream, without even watching Agaso draw his bow, began moving up the tree a fraction of a second before the bowstring twanged.

He grasped the wounded cuscus before it might regain its senses and slipped out onto a slender branch that whizzed him down to dangle in the air an inch or so before Agaso’s startled face. The startle had begun its standard transformation to ecstasy, when Usumu startled him again by provocatively dropping the quivering cuscus onto his naked foot, as he flicked a tasty beetle he’d found up in the tree into the pubis of delighted young Koniye (the youngest of the group). Doubly startled in quick succession, Agaso was wallowing in an ecstasy, then shared by all, until he abruptly realized that the cuscus might come back to life and dash off. Then in a mirthful scramble they all secured it.30

Such uncanny coordination can hardly exist in a context of deception, and with such close familiarity with subtle tics, micro-expressions, and moods, deception would be impossible anyway. Sorenson almost begins to sound like Montaigne when he notes:

As detailed above, sociosensual child nurture spawned body language based on tactile exchanges of affect. Infants were quick to notice that the happiness of others made their own lives happier and richer, so they responded accordingly. Soon they realized that the more accurately and fully they conveyed their inner needs and interests, the more quickly rewarding responses were forthcoming. So they displayed true feelings without artifice, as openly and clearly as their tiny frames permitted. The more skilled they were, the happier they were; indeed, the happier were all.

Therefore ‘tactile-talk’ was ‘affect-talk,’ and ‘affect-talk’ was ‘truth-talk.’ It was so compelling that even after learning verbal speech children continued bouncing inner passions back and forth in ‘affect-talk.’ The messages were more emotionally rewarding. They moved more quickly and more accurately and were usually more deeply evocative. Spoken words did not have the same instant sensuality and were thus more remote from lives sentiently focused. Affect-talk was truth-talk because it only worked when personal feelings were above board and accurately expressed, which required transparency in aspirations, interests, and desires.

With body language based on full-time accurate truth, infants became candid and open, and remained so as they grew. When I first went into their hamlets I was astonished to see the words of tiny children accepted at face value—and so acted on. For months I tried to find at least one case where a child’s words were considered immature and therefore disregarded. No luck. I tried to explain the idea of lying and inexperience. They didn’t get my point. They didn’t expect prevarication, deception, grandstanding, or evasion. And I could find no cases where they understood these concepts. Even teenagers remained transparently forthright, their hearts opened wide for all to gaze inside.30

What separates Montaigne and Sorenson here is twofold:

  1. Sorenson has a mechanism. This is not simply glorifying primitive life; he has made observations, and has noted how this takes place. The deep, sensual and emotional synchronization of the community that helps them make a living for themselves precludes deceit within the group. They’re still perfectly capable of deception: on the hunt, trapping, or with outsiders.
  2. Sorenson has identified one domain where deceit is impossible: one cannot use “affect-talk” deceptively. Words can still deceive, because they exist at a symbolic distance, but the “affect-talk” of our micro-expressions, our muscles, and all the subtle, physiological cues of our current emotional state do not lie. In short, Sorenson is simply saying that primitives spend their entire lives attuning themselves to one another’s subtle cues. The resulting honesty is no more spectacular or supernatural than if we were to build an entire society where everyone was equipped with a polygraph machine; the only difference is that they have invested so much in one another that they do not need a machine.

Sorenson was also present to witness the catastrophic breakdown of this “preconquest consciousness.”

After loss of intuitive rapport, the sensually empathetic instincts governing sociosensual nurture became cruder and were less often on-the-mark. In large regions a grand cultural amnesia sometimes accompanied this collapse. Whole populations would forget even recent past events and make gross factual errors in reporting them. In some cases they even forgot what type and style of garments they had worn a few years earlier or (in New Guinea) that they had been using stone axes and eating their dead close relatives a few years back. Initially I thought they were dissimulating in an effort to ingratiate or appear up-to-date, but rejected this thought almost immediately. They were simply too unassuming and open in other respects for such a theory to hold up. And when I showed photographs I’d taken a few years earlier, they would brighten up, laugh, and eagerly call their friends as they excitedly began relating their reviving recollections.

The periods of anomie sometimes alternated with spates of wild excitement leading to a strange mixture of excess and restraint. It was during such disorders that abstract concepts of rights, property, and possession began emerging. So did formal names for people, groups, and places. These were then used argumentatively in defense of rights, property, and possessions. Negative emotions were applied to strengthen argument. Eventually they became structural aspects of society. As the art of political manipulation emerged, the selfless unity that seemed so firm and self-repairing in their isolated enclaves vanished like a summer breeze as a truth-based type of consciousness gave way to one that lied to live.31

Sorenson quotes his own field notes at the climax of the crisis: “Epidemic sleeplessness, frenzied dance throughout the night, reddening burned-out eyes getting narrower and more vacant as the days and nights wore on, dysphasias of various sorts, sudden mini-epidemics of spontaneous estrangement, lacunae in perception, hyperkinesis, loss of sensuality, collapse of love, impotence, bewildered frantic looks like those on buffalo in India just as they’re clubbed to death; 14 year olds (and others) collapsing on the beach, under houses, on the pier, in beached boats as well as those tied up at the dock, here and there, into wee hours of the morn, even on through dawn, in acute inebriation or exhaustion.” Over a span of a few weeks, Sorenson watched the “preconquest consciousness” he had observed violently collapse, to be replaced with the kind of socio-psychological environment ethnographers more commonly record.

Certainly, any notion that the capacity for deception in a modern Native American, African, Polynesian, or other “exotic race” differs from that of a modern European is simply nonsense. Likewise, the idea that Europeans introduced the notion of deceit to groups that so revered the Trickster is simply silly. That said, there is much more to the notion of the honest “Noble Savage” than we might otherwise be inclined to admit. Raised in traditional communities, humans develop a deep rapport with one another, what Derrick Jensen called a “language older than words.” In that language of microexpressions and subtle movement, deception is not possible. This sheds some light on the previous subject of warfare, as well. Primitive peoples delineated themselves not by “ethnicity,” but by language. The flip-side of the tenderness and closeness of the community that Sorenson observes is the brutal violence employed to protect that community. The depth of one’s connection to the other members of one’s community made the shallowness of one’s connection with those one could only speak to all the more evident—and those who could not even speak the same language must have seemed more distant still. Some have puzzled over this seeming “dichotomy,” but in fact it is difficult to understand one without the other. The ferocity of primitive warfare wells up from the need to protect something so precious. Only when fighting for an ephemeral “State” to which one feels no true dedication can one abide by “rules of engagement,” rather than employ all one’s strength and bravery to make sure that no enemy ever threatens one’s family again.

The Super Human

Jean-Jacques Rousseau, the philosopher most often credited with inventing the “Noble Savage,” wrote:

It is therefore no matter for surprise that the Hottentots of the Cape of Good Hope distinguish ships at sea, with the naked eye, at as great a distance as the Dutch can do with their telescopes; or that the savages of America should trace the Spaniards, by their smell, as well as the best dogs could have done; or that these barbarous peoples feel no pain in going naked, or that they use large quantities of pimento with their food, and drink the strongest European liquors like water.

Rousseau’s image of “man in the state of nature” has been no less overturned by anthropological observation than Hobbes’, but in Rousseau’s case it is the supposed lack of culture and thought that has suffered most from further inquiry. Rousseau’s “Noble Savage” conforms more closely to the Enlightenment’s notion of a “dumb beast,” blissfully ignorant.

More recently, John Zerzan has made similar arguments, as in Future Primitive.

DeVries has cited a wide range of contrasts by which the superior health of gatherer-hunters can be established, including an absence of degenerative diseases and mental disabilities, and childbirth without difficulty or pain. He also points out that this begins to erode from the moment of contact with civilization.

Relatedly, there is a great deal of evidence not only for physical and emotional vigor among primitives but also concerning their heightened sensory abilities. Darwin described people at the southernmost tip of South America who went about almost naked in frigid conditions, while Peasley observed Aborigines who were renowned for their ability to live through bitterly cold desert nights “without any form of clothing.” Levi-Strauss was astounded to learn of a particular [South American] tribe which was able to “see the planet Venus in full daylight,” a feat comparable to that of the North African Dogon who consider Sirius B the most important star; somehow aware, without instruments, of a star that can only be found with the most powerful of telescopes. In this vein, Boyden recounted the Bushman ability to see four of the moons of Jupiter with the naked eye.

In The Harmless People, Marshall told how one Bushman walked unerringly to a spot in a vast plain, “with no bush or tree to mark place,” and pointed out a blade of grass with an almost invisible filament of vine around it. He had encountered it months before in the rainy season when it was green. Now, in parched weather, he dug there to expose a succulent root and quenched his thirst. Also in the Kalahari Desert, van der Post meditated upon San/Bushman communion with nature, a level of experience that “could almost be called mystical. For instance, they seemed to know what it actually felt like to be an elephant, a lion, an antelope, a steenbuck, a lizard, a striped mouse, mantis, baobab tree, yellow-crested cobra or starry-eyed amaryllis, to mention only a few of the brilliant multitudes through which they moved.” It seems almost pedestrian to add that gatherer-hunters have often been remarked to possess tracking skills that virtually defy rational explanation.32

The fact that the story of the Dogon has been proven false33 does not help the credibility of the rest, but there is certainly no small list of physical feats that have been ascribed to hunter-gatherers. And why not? We marvel at a cheetah’s speed, a cat’s grace, a hawk’s vision, and so on. Does it really stand to reason that humans alone were shortchanged by evolution, as the Greeks concluded in their myth of Prometheus and Epimetheus? We evolved in Africa, a hyper-competitive continent ruled by “super-predators.” How did we survive if our physical forms are as weak and frail as we assume?

We know that civilization makes us sick, and that we have not developed any superior medicine to balance that out. Humans never evolved to eat grains, yet we eat them almost exclusively. Even people who remain entirely civilized in all other respects but adopt a “Paleo Diet” eschewing grains note remarkable improvements in health. What should surprise us would be anything else. Humans evolved as hunter-gatherers; wouldn’t it make sense for us to be healthy in our evolutionary niche, and for our health to suffer when we are not? Archaeologists have noted a “Neolithic Mortality Crisis,” during which longevity dropped off precipitously by half or more. This is dramatically illustrated by the finds at Dickson Mounds.33

In his classic “The Worst Mistake in the History of the Human Race,” Jared Diamond outlined the manner in which civilization has been the greatest public health disaster of all time.

One straightforward example of what paleopathologists have learned from skeletons concerns historical changes in height. Skeletons from Greece and Turkey show that the average height of hunter-gatherers toward the end of the ice ages was a generous 5’ 9″ for men, 5’ 5″ for women. With the adoption of agriculture, height crashed, and by 3000 B.C. had reached a low of only 5’ 3″ for men, 5’ for women. By classical times heights were very slowly on the rise again, but modern Greeks and Turks have still not regained the average height of their distant ancestors.

Another example of paleopathology at work is the study of Indian skeletons from burial mounds in the Illinois and Ohio river valleys. At Dickson Mounds, located near the confluence of the Spoon and Illinois rivers, archaeologists have excavated some 800 skeletons that paint a picture of the health changes that occurred when a hunter-gatherer culture gave way to intensive maize farming around A.D. 1150. Studies by George Armelagos and his colleagues then at the University of Massachusetts show these early farmers paid a price for their new-found livelihood. Compared to the hunter-gatherers who preceded them, the farmers had a nearly 50 per cent increase in enamel defects indicative of malnutrition, a fourfold increase in iron-deficiency anemia (evidenced by a bone condition called porotic hyperostosis), a threefold rise in bone lesions reflecting infectious disease in general, and an increase in degenerative conditions of the spine, probably reflecting a lot of hard physical labor. “Life expectancy at birth in the pre-agricultural community was about twenty-six years,” says Armelagos, “but in the post-agricultural community it was nineteen years. So these episodes of nutritional stress and infectious disease were seriously affecting their ability to survive.”34

Stefansson tried to find evidence of cancer among the Inuit. He began his search in 1884, and it took him 49 years to find a single confirmed case in 1933, though he described a possible, unconfirmed case in 1900 as well. Others have described a possible case of cancer in a 500-year-old Inuit mummy. Many doctors who have worked in primitive settings have described the same phenomenon: they marveled that they never encountered cases of cancer, and though they would not go so far as to say it didn’t exist, they concluded that it must have been astonishingly rare.35

All of this points to the shocking conclusion that when the human body is used in its proper evolutionary context, it is capable of the same amazing feats as other animals. This does not mean that primitive people enjoy perfect health; they do, after all, have very effective medical systems of their own, and why would those exist if they enjoyed perfect health? But it is easy to see that civilization has been a disaster for human beings in terms of health. Rather than looking on primitive people as superhuman, we should see them as simply healthy human beings. The gap comes from the fact that even the healthiest of us have been chronically ill for our entire lives.

Beyond this, the “affect-talk” that primitive peoples grow up with is not limited simply to Homo sapiens. Once such subtleties can be read as completely as speech itself, communication with other animals becomes possible as well. David Abram describes the emergence of human language from non-human communication in The Spell of the Sensuous.

Hunting, for an indigenous, oral community, entails abilities and sensitivities very different from those associated with hunting in technological civilization. Without guns or gunpowder, a native hunter must often come much closer to his wild prey if he is to take its life. Closer, that is, not just physically but emotionally, empathically entering into proximity within the other animal’s ways of sensing and experiencing. The native hunter, in effect, must apprentice himself to those animals that he would kill. Through long and careful observation, enhanced at times by ritual identification and mimesis, the hunter gradually develops an instinctive knowledge of the habits of his prey, of its fears and its pleasures, its preferred foods and favored haunts. Nothing is more integral to this practice than learning the communicative signs, gestures, and cries of the local animals. Knowledge of the sounds by which a monkey indicates to the others in its band that it has located a good source of food, or the cries by which a particular bird signals distress, or by which another attracts a mate, enables the hunter to anticipate both the large-scale and small-scale movements of various animals. A familiarity with animal calls and cries provides the hunter, as well, with an expanded set of senses, an awareness of events happening beyond his field of vision, hidden by the forest leaves or obscured by the dark of night. Moreover, the skilled human hunter often can generate and mimic such sounds himself, and it is this that enables him to enter most directly into the society of other animals.

Thus, the whole world becomes the extended senses of the hunter. This has been thoroughly noted in other animals: they listen to the alarm cries of other animals around them, and know that those calls mean that danger is approaching. Even plants eavesdrop on one another.36 There is a pan-specific forum going on with all the furor of a crowded marketplace in every vibrant ecology, but we’re to believe that Homo sapiens alone is uniquely cut out of the loop? Rather, we’ve simply learned to be the misers of empathy, and by withholding the notion of intelligence in the living world around us, we tune out what it has to say. Abram goes on to detail how the development of the alphabet made us deaf to the world around us. Oral cultures do not share that deficit. Part of their extraordinary senses is the fact that they do not rely on their own senses alone: the whole world serves as their eyes and ears.

Finally, primitive cultures do not segregate the world as we do; they are synaesthetes. We retain a significant measure of this, though we do not train it. We speak of “loud” colors, or “bright” sounds. As Richard Manning relates in Against the Grain:

There are beings, many of them human beings, that see, smell, hear, remember, sense more than we do. This is not a genetic accident, like being taller than six-foot-five or having an IQ of 150 or high cheekbones. This is a matter of culture. The human beings who maintain these hyper-refined senses are hunter-gatherers. Their impressive powers of perception have been noted and detailed by just about every student of hunter-gatherer groups. It is not only that they sense more than the rest of us do, but that they do so in a qualitatively different fashion. … The term “synaesthesia” describes something every child knows. In fact, Merleau-Ponty believes that we have “unlearned how to see, hear, and, generally speaking, to feel.” Synaesthesia is the mental function (or suite of functions) in which the senses run together, in which colors have a feel to them and tastes have a color. We speak of a loud shirt, of bright music, yet how often do we sense reality this way? For Abram and other observers, the phenomenon marks a total immersion in sense, when the observer is no longer in control, no longer separating and analyzing sight, sound, and texture, and becomes a part of his sensual surroundings. That is, the observer calls forth the world.

The Wise Indian

At this point, we have already seen that before the “apocalypse” of European contact, primitive societies the world over possessed a deep, sensual/emotional bond with their communities that we would naïvely call “telepathic,” the ability to effectively speak with other animals, and a relationship with the living world around them that turns it into an extended part of one’s self, an exterior set of senses as critical as sight or hearing. Moreover, we’ve seen at each point along the way that these seemingly extraordinary things are quite unremarkable and even common in most non-human species; in fact, what would be incredible would be a scenario in which Homo sapiens successfully evolved into the present without these things.

Of course, we can just as easily notice that we First Worlders at the beginning of the twenty-first century lack all of these things. It seems an extraordinary claim to suggest that humans were not uniquely neglected by evolution, but in fact possess all these faculties that we have never experienced. Part of the reason for this has already been discussed by Jean Liedloff as a consequence of how we are raised. Another part lies in our lifestyle, which neglects our hunter-gatherer evolutionary niche and feeds us a steady diet of cereal grains and other foods we’re ill-adapted to, with varying degrees of toxicity. Another major element of our loss was outlined by David Abram in The Spell of the Sensuous: the radical disconnection from the living world that we experience when we learn to read.

Abram notes that in oral cultures, language is not simply the arbitrary medium suggested by modern linguists. Rather, indigenous languages are shaped by the sounds and interactions of the world around them. One particularly striking example comes from the Koyukon language.

The Arctic tern (k’idagaas’), the northern phalarope (tiyee), the rusty blackbird (ts’uhutlts’eegga), the blackpoll warbler (k’oot’anh), the slate-colored junco (k’it’otlt’ahga)—all have such names. Written transcription, however, cannot convey the remarkable aptness of these names, which when spoken in Koyukon have a lilting, often whistle-like quality. The interpenetration of human and nonhuman utterances is particularly vivid in the case of numerous bird songs that seem to enunciate whole phrases or statements in Koyukon.

Many bird calls are interpreted as Koyukon words … what is striking about these words is how perfectly they mirror the call’s pattern, so that someone outside the tribe who knows birdsongs can readily identify the species when the words are spoken in Koyukon. Not only the rhythm comes through, but also some of the tone, the “feel” that goes with it.

As we ponder such correspondences, we come to realize that the sounds and rhythms of the Koyukon language have been deeply nourished by these nonhuman voices.

Hence the whirring, flutelike phrases of the hermit thrush, which sound in the forest thickets at twilight, speak the Koyukon words sook’eeyis deeyo—”it is a fine evening.” The thrushes also sometimes speak the phrase nahutl-eeyh—literally, “a sign of the spirit is perceived.” The thrush first uttered these words in the Distant Time, when it sensed a ghost nearby, and even today the call may be heard as a warning.

Similarly, he relates the story of Manuel Córdova-Rios, who learned the language of the Amahuaca in the Amazon by learning the sounds of the jungle.

One of the most revealing twentieth-century accounts of a relatively intact indigenous community is that recorded by F. Bruce Lamb from the spoken recollections of the Peruvian doctor Manuel Córdova-Rios. Córdova-Rios was captured in 1907, when he was fifteen years old, by a small tribe of Amahuaca Indians living deep in the Amazonian rain forest (between the headwaters of the Juruá, Purús, Madre de Dios, and Inuya rivers)—probably the remnant of a larger tribe decimated by the incursion of the rubber-tapping industry into the forest. He was carefully trained by the headman of this small tribe to become his successor, and was for six years meticulously tutored in the ways of the hunt, in the medicinal and magical powers of the rain forest plants, and in the traditional preparation and use of the extracts from the ayahuasca vine to attain, when necessary, a clairvoyant state of fusion with the enveloping jungle ecosystem.

Curiously, the tribe’s language, which remained largely meaningless to Córdova-Rios for six months or more, became understandable to his ears only as his senses became attuned to the subtleties of the rain forest ecology in which the culture was embedded. He did, eventually, become headman of the tribe, yet he fled the rain forest the following year after a series of attempts on his life by a neighboring band.

Córdova-Rios’s descriptions of the various hunts in which he participated make vividly evident the extent to which these people’s senses were directly coupled to the enveloping forest:

They reacted to the faintest signals of sound and smell, intuitively relating them to all other conditions of the environment and then interpreting them to achieve the greatest possible capture of game … Many of the best hunters seemed to know by some special extra sense just where to find the game they sought, or they had developed some special method of drawing game to them. Knowing how to imitate and to use the signals the animals made to communicate between their kind in various situations helped in locating game and drawing it within sighting range of an astute hunter.

In the course of Córdova-Rios’s account, we read careful descriptions of hunters sequestered in the foliage of high fruit trees luring partridges toward them with mimicked bird calls signaling the discovery of an abundant food source. We read of one hunter who, upon hearing a band of monkeys moving through the dense forest canopy overhead, utters a cry that would be made by a baby monkey if it had fallen to the ground. This call stops the roving monkeys and brings them down beneath the thick foliage into the hunter’s arrow range; the hunter shoots two of them to feed his family. Later Córdova-Rios’s native comrades teach him, through imitation, the principal vocal signals of a species of wild pig that they are hunting.

Thus, Abram shows, human language helped bind human communities into the larger ecology, and helped them relate to the non-human communities they lived with and depended upon. As we have already seen, such relationships are crucial for the survival of all species. In its origin, Abram writes, writing had the same function.

Writing, like human language, is engendered not only within the human community but between the human community and the animate landscape, born of the interplay and contact between the human and the more-than-human world. The earthly terrain in which we find ourselves, and upon which we depend for all our nourishment, is shot through with suggestive scrawls and traces, from the sinuous calligraphy of rivers winding across the land, inscribing arroyos and canyons into the parched earth of the desert, to the black slash burned by lightning into the trunk of an old elm. The swooping flight of birds is a kind of cursive script written on the wind; it is this script that was studied by the ancient “augurs,” who could read therein the course of the future. Leaf-miner insects make strange hieroglyphic tabloids of the leaves they consume. Wolves urinate on specific stumps and stones to mark off their territory. And today you read these printed words as tribal hunters once read the tracks of deer, moose, and bear printed in the soil of the forest floor. Archaeological evidence suggests that for more than a million years the subsistence of humankind has depended upon the acuity of such hunters, upon their ability to read the traces—a bit of scat here, a broken twig there—of these animal Others. These letters I print across the page, the scratches and scrawls you now focus upon, trailing off across the white surface, are hardly different from the footprints of prey left in the snow. We read these traces with organs honed over millennia by our tribal ancestors, moving instinctively from one track to the next, picking up the trail afresh whenever it leaves off, hunting the meaning which would be the meeting with the Other.

But writing also opened up the possibility of closing off human speech, and that is precisely what happened. The first writing consisted of ideograms, direct representations of the ideas they expressed; the Chinese writing system still works this way. Later, rebuses were used, such as drawing a bee and a leaf to represent “belief.” This was a critical step, because the symbols now referred not to the thing itself, but to the human sound signifying it. The Hebrew alphabet operates like this; the letters each refer to actual things in the sensuous world, so there is still a point of reference in the living world around us. The meanings read into the letters in Kabbalah, for example, exemplify the richness this carries through. Perhaps even more importantly, the Hebrew alphabet does not have any vowels: the written word must still be engaged, brought to life with sounded breath. When such alphabets reached the Greeks, however, the symbols lost their meaning; extra letters that did not correspond with any Greek phonemes were instead used to represent vowels for the first time. With the Greek alphabet, the transition was complete: writing had become a completely human discourse.

Abram notes that Socrates and Plato lived at the time that writing was introduced as a basic part of Greek education: Socrates just before, and Plato just after. So Alfred North Whitehead’s proclamation—”The safest general characterization of the European philosophical tradition is that it consists of a series of footnotes to Plato”—may be true in that Plato began to follow through on the ideological ramifications of writing.

Everything that we speak of as Western civilization we could speak of as alphabetic civilization. We are the culture of the alphabet, and the alphabet itself could be seen as a very potent form of magic. You know, we open up the newspaper in the morning and we focus our eyes on these little inert bits of ink on the page, and we immediately hear voices and we see visions and we experience conversations happening in other places and times. That is magic!

It’s outrageous: as soon as we look at these printed letters on the page we see what they say. They speak to us. That is not so different from a Hopi elder stepping out of her pueblo and focusing her eyes on a stone and hearing the stone speak. Or a Lakota man stepping out and seeing a spider crawling up a tree and focusing his eyes on that spider and hearing himself addressed by that spider. We do just the same thing, but we do it with our own written marks on the page. We look at them, and they speak to us. It’s an intensely concentrated form of animism. But it’s animism nonetheless, as outrageous as a talking stone.

In fact, it’s such an intense form of animism that it has effectively eclipsed all of the other forms of animistic participation in which we used to engage—with leaves, with stones, with winds. But it is still a form of magic.36

Walter Ong’s classic Orality & Literacy discusses how the transition from orality to literacy transforms the world from a socially constructed world of events to an objective world of things.

Though words are grounded in oral speech, writing tyrannically locks them into a visual field forever. A literate person, asked to think of the word “nevertheless,” will normally (and I strongly suspect always) have some image, at least vague, of the spelled-out word and be quite unable ever to think of the word “nevertheless” for, let us say, 60 seconds, without adverting to any lettering but only to the sound. This is to say, a literate person cannot fully recover a sense of what the word is to purely oral people.

By comparison:

In a primary oral culture, the expression “to look up something” is an empty phrase: it would have no conceivable meaning. Without writing, words have no visual presence, even when the objects they represent are visual. They are sounds. You might “call” them back—“recall” them. But there is nowhere to “look” for them. They have no focus and no trace (a visual metaphor, showing dependency on writing), not even a trajectory. They are occurrences, events.

This leads to extremely different approaches to the rest of the world. The “thing-like” universe of literacy encourages critical, logical thought and a reductionistic, mechanistic expectation of a dead, “clockwork” world. The “event-like” universe of orality, on the other hand, encourages a holistic approach, expecting complex systems with emergent properties and self-regulating feedbacks that can be approached as other persons.

For example, cultural anthropologists have found that preliterate cultures often do not systematically form thoughts with individual building-block words as we do in the West. The preliterate thought process is largely holistic (as is music), unlike the Western tendency to separate word from thought and alphabet letter from word, forming a hierarchical structure where, as McLuhan says, “semantically meaningless letters are used to correspond to semantically meaningless sounds.” McLuhan and others have gone on to show how this separation of symbol from its meaning relates to the advent of movable type printing and is essential to the fostering of the industrial-technological-scientific society: this extension and separation of the symbol from direct contact with that being symbolized enabled it to become modular and reproducible en masse, resulting in, for example, a car part made in Japan for production in Detroit, a scientific journal in one city being understood in another distant laboratory without direct contact, or a sonata written in one location performed in another by another individual.37

We can start to see how the ramifications of writing form the core of Plato’s philosophy.

Platonism has traditionally been interpreted as a form of metaphysical dualism, sometimes referred to as Platonic realism, and is regarded as one of the earlier representatives of metaphysical objective idealism. According to this reading, Plato’s metaphysics divides the world into two distinct aspects: the intelligible world of “forms”, and the perceptual world we see around us. The perceptual world consists of imperfect copies of the intelligible forms or ideas. These forms are unchangeable and perfect, and are only comprehensible by the use of the intellect or understanding, that is, a capacity of the mind that does not include sense-perception or imagination. This division can also be found in Zoroastrian philosophy, in which the dichotomy is referenced as the Minu (intelligence) and Giti (perceptual) worlds. The Zoroastrian ideal city, Shahrivar, also exhibits certain similarities with Plato’s Republic. The existence and direction of any influence here are uncertain; while Zoroaster lived well before Plato, few of the earliest writings of Zoroastrianism survive unaltered.38

Yet even Plato himself harbored doubts that this was the best course. The Phaedrus ruminates on the limitations of writing, including this story about an Egyptian king named Thamus being offered various gifts by Thoth, the Egyptian god of wisdom and invention:

But when they came to letters, “This,” said Thoth, “will make the Egyptians wiser and give them better memories; it is a specific both for the memory and for the wit.”

Thamus replied: “O most ingenious Thoth, the parent or inventor of an art is not always the best judge of the utility or inutility of his own inventions to the users of them. And in this instance, you who are the father of letters, from a paternal love of your own children have been led to attribute to them a quality which they cannot have; for this discovery of yours will create forgetfulness in the learners’ souls, because they will not use their memories; they will trust to the external written characters and not remember of themselves. The specific which you have discovered is an aid not to memory, but to reminiscence, and you give your disciples not truth, but only the semblance of truth; they will be hearers of many things and will have learned nothing; they will appear to be omniscient and will generally know nothing; they will be tiresome company, having the show of wisdom without the reality.”

This discernment destroys the synaesthesia common in primitive societies, and the abstraction of the alphabet alienates us from the living world. Abram writes:

From an animistic perspective, the clearest source of all this distress, both physical and psychological, lies in the violence uselessly perpetrated by our civilization on the ecology of the planet; only by alleviating the latter will we be able to heal the former. This may sound at first like a simple statement of faith, yet it makes eminent and obvious sense as soon as we acknowledge our thorough dependence upon the countless other organisms with whom we have evolved. Caught up in a mass of abstractions, our attention hypnotized by a host of human-made technologies that only reflect us back upon ourselves, it is all too easy for us to forget our carnal inherence in a more-than-human matrix of sensations and sensibilities. Our bodies have formed themselves in delicate reciprocity with the manifold textures, sounds, and shapes of an animate Earth; our eyes have evolved in subtle interaction with other eyes, as our ears are attuned by their very structure to the howling of wolves and the honking of geese. To shut ourselves off from these other voices, to continue by our life-styles to condemn these other sensibilities to the oblivion of extinction, is to rob our own senses of their integrity, and to rob our minds of their coherence. We are human only in contact and conviviality with what is not human. Only in reciprocity with what is Other do we begin to heal ourselves.

Herein lies the great spiritual crisis of civilization that so many philosophers, thinkers, and religious leaders have pondered and pained over, as well as the problems of the “plastic medicine man” that lie at the heart of this last aspect of the “Noble Savage.” Abram writes again:

Today, in the “developed world,” many persons in search of spiritual self-understanding are enrolling for workshops and courses in “shamanic” methods of personal discovery and revelation. Meanwhile psychotherapists and some physicians have begun to specialize in “shamanic healing techniques.” “Shamanism” has come, thus, to denote an alternative form of therapy; the emphasis, among these new practitioners of popular shamanism, is on personal insight and curing. These are noble aims, to be sure, yet they are, I believe, secondary to and derivative from the primary role of the indigenous shaman, a role that cannot be fulfilled without long and sustained exposure to wild nature, its patterns and vicissitudes. Mimicking the indigenous shaman’s curative methods without knowledge of his or her relation to the wider natural community cannot, if I am correct, do anything more than trade certain symptoms for others, or shift the focus of disease from place to place within the human community. For the source of stress lies in the relation between the human community and the living land that sustains it.

In oral societies, shamans exist to tend the boundaries between human and non-human communities, as ambassadors and negotiators with the non-human world. The “shamanic state of consciousness” that captivates Western imagination (as with the drug culture fascinated by the egregious hoax of Carlos Castaneda) is merely a tool to this end. Michael Winkelman’s work has shown how deeply shamanism is tied into the structures of the human brain,39 and we can further understand shamanism’s effectiveness in terms of the kind of “thin slicing” Malcolm Gladwell discusses in Blink, if we bear in mind that shamanism actively cultivates such techniques, so that the gap that opens up is like the gap between someone tapping out a melody on a glass at a restaurant and a world-class symphony. But most importantly, traditional shamanism exists only as part of a tribe, and as part of a larger relationship that binds family and land.40 The “plastic medicine men” of contemporary cultural appropriation, in a desperate search for something real and authentic, instead perform a final kind of cultural theft to follow the genocide and decimation of Native peoples by robbing them of what culture they have left.

The Noble Savage provides a fantasy for Euro-Americans wishing to escape dilemmas of their own culture. Imitation of Native Americans and other appropriations of their identity have often accompanied this romanticization. In “The Tribe Called Wannabee: Playing Indian in America and Europe,” Cherokee scholar Rayna Green does an excellent job of tracing this historical phenomenon of “playing Indian” from the Boston Tea Party to YWCA sponsored “Indian princess” programs. …

Despite the New Agers’ professions that they are working toward social and cultural change, their commercialization of Native American spirituality articulates well within late-twentieth-century consumer capitalism. There is strong historical and social evidence that the commercialization of ideas and values, as well as the fetishized image of a social body perceived to be ethnically Other, stems in part from thought and practices produced within the context of recent consumer capitalism. Although the New Age spiritualists identify themselves as countercultural, their uncritical ideas about commercialization and marketing practices appear to have been shaped by the larger capitalist market economy. Moreover, their imperialistically nostalgic fetishization of Native American spirituality hinders any recognition of their own historical and social complicity in the oppression of indigenous peoples.41

Such cultural appropriation has been ongoing for some time for the simple reason that Europeans have little else to turn to. Even before the Boston Tea Party, the tradition of “going native” was alive and well. As J. Hector St. John de Crevecoeur wrote in his Letters from an American Farmer:

There must be in the Indians’ social bond something singularly captivating, and far superior to anything to be boasted of among us; for thousands of Europeans are Indians, and we have no examples of even one of those Aborigines having from choice become Europeans! There must be something very bewitching in their manners, something very indelible and marked by the very hands of Nature. For, take a young Indian lad, give him the best education you possibly can, load him with your bounty, with presents, nay with riches, yet he would secretly long for his native woods, which you would imagine he must have long since forgot; and on the first opportunity he can possibly find, you will see him voluntarily leave behind all you have given him and return with inexpressible joy to lie on the mats of his fathers.

Benjamin Franklin offered a similar assessment:

No European who has tasted Savage Life can afterwards bear to live in our societies. … The Care and Labour of providing for Artificial and fashionable Wants, the sight of so many Rich wallowing in superfluous plenty, whereby so many are kept poor and distress’d for Want, the Insolence of Office … the restraints of Custom, all contrive to disgust them with what we call civil Society.

The most powerful element of the primitivist lure is also the most difficult to prove: the yawning void that every civilized person intuits, the feeling that something vital is missing, that we’ve led our entire lives as pale shadows of what we were born to be. The “wise, old Indian” element of the “Noble Savage” stereotype reflects this, the notion that somewhere along the line, civilization lost some essential knowledge of how to live in the world, and the hope that maybe not every culture did.

Today, decimated by the genocide perpetrated by our forefathers, Native peoples live lives terribly similar to our own. Whatever wisdom they possessed was not some innate quality of their blood, but a product of the culture they were raised in—a culture based on a way of life that’s no longer possible. They still retain the memories and teachings of those cultures, and they remain some of the best clues we have of what a healthy human culture looks like, but no mere spiritual belief or religious rite can undo ten thousand years of destruction and genocide.

To reclaim what we’ve lost means reconciling our relationship with the living earth, and grounding ourselves once again in close-knit communities, and in the non-human communities that create us and sustain us. Divorced from that, we are cut off from the very source of our well-being and strength. Once again, we should marvel less at the “wise, old Indian,” and more at how much we have lost.

But cultural appropriation offers a false hope. We need to learn from those who went before us, but we also need to find our own way, create our own communities as we reconcile ourselves with the living earth. Permaculture can help ground us and begin that reconciliation.42 Anything that encourages us to think as animists, to “return to our senses,”43 can help as well. Daniel Noel has suggested Merlin as a model for a culturally-honest, Western neoshamanism.44 Other shreds survive, even in our civilization. We have enough to begin building a new, sustainable society, but it must be syncretic. Simply stealing the traditions of others won’t give us a way to live; it’s just one more theft in a long history of theft and genocide.

Conclusion

The “Noble Savage” has long been the straw man beaten by those who would hope to continue beating the drums of war and empire, just as Crawfurd and Hunt did, the white supremacists who revived the term as we have it today. In our Romantic fervor, we take it entirely too far. Primitive peoples have an impact on their environment, it’s just a positive one. They fight, they simply fight less. They deceive, they simply have communities and ways of relating where deception is impossible. They get sick, just less often. They’re still human, they just know what that truly means. At times, the “Noble Savage” seems to make primitive people out to be perfect in every way. That’s absurd. They are still people. What differs is that they still remember what being a person entails. It’s not a perfect life—it’s just a vast improvement.

This should hardly surprise anyone. As Daniel Quinn wrote in Beyond Civilization:

The tribal life and no other is the gift of natural selection to humanity. It is to humanity what pack life is to wolves, pod life is to whales, and hive life is to bees. After three or four million years of human evolution, it alone emerged as the social organization that works for people.

In the same volume, Quinn points out why this is no cause for any surprise; in fact, we should be shocked if it happened any differently.

Natural selection is a process that separates the workable from the unworkable, not the perfect from the imperfect. Nothing evolution brings forth is perfect, it’s just damnably hard to improve upon.

Ten thousand years ago, humans began living in a radically different way. We’ve been adapting the whole time, trying to make it work, but this way of life is so antithetical to our evolved nature, so opposite of everything that’s gone before, that it will take another hundred thousand years or more before we’re done.

But, as is to be expected given the term’s history, those seeking to dispel the myth are responsible for perpetuating far more egregious myths. As we’ve seen, the “Noble Savage” as a stereotype is undoubtedly a shallow, Romantic exaggeration. At the same time, each aspect of it has roots in some aspect of human life that civilization has lost. Throughout history, some measure of primitivism has accompanied the march of ever-greater complexity.

In his monumental survey of Western culture over the past five centuries, From Dawn to Decadence, Jacques Barzun records that a yearning for what he calls “primitivism” has been a powerful force over the whole of this time. Primitivism is one of the key impulses that inspired those who were drawn to Martin Luther and the Protestant Reformation. Barzun notes that this impulse is based on the perception that complex social systems are both oppressive and corrupt. Hence it repeatedly creates a demand for “the pure,” not only in religion and social organization, but across the whole of the culture: “pure love, pure thought, pure form in art.” It is the principal impulse behind the notion of the “noble savage.” Primitivism, Barzun argues, is closely related to the demand for emancipation. Together, the rejection of complex, rule-based society and the yearning for an earlier, simpler, more natural state of humanity have regularly produced calls for the overthrow of existing social structures, in both the church and the polity.45

The silly notion of “purity” that could never work in a truly primitive life aside, this is largely true. Most of the great movements in history were essentially primitivist; Jesus, Martin Luther, and many, many others held essentially primitivist views, and sought to push back the overwhelming, dehumanizing complexity of civilization. Today, we know that civilization is defined by its unhealthy relationship with complexity,46 and that complexity is subject to diminishing returns.47 In that pattern, civilization damns itself to collapse. Primitivism makes sense of that pattern, and offers hope for the future by throwing light on just how dehumanizing civilization is.

Those who beat the drum most loudly against the “Noble Savage” are also those with the most religious zeal—not necessarily in Christianity or any other conventional religion, but in progress.48 A fine example can be found in Bruce Thornton’s Plagues of the Mind:

By denigrating Western civilization—the imperfect but still best hope for controlling humanity’s penchant for evil and for providing the greatest freedom for the greatest number of people—the myth of the noble savage nurtures the false hope that human perfection and freedom are possible without civilization.

What is most interesting here is that in this volume dedicated to separating “good knowledge” from “bad knowledge,” Thornton preaches so dogmatically about how important it is to never question the importance of civilization. We must ignore the fact that it has increased the level of violence in our world from an occasional and limited affair to a ubiquitous way of life, the way it has decimated the living world and brought on a new mass extinction, the way it has institutionalized deceit, treachery, and alienation, and the way it has even broken us down as human beings, stripped us of our health, our senses, and ultimately, even our personhood and dignity. This must never be questioned, Thornton urges. We must not doubt civilization; we must maintain our faith.

Thornton’s “Plagues of the Mind” invokes the same metaphor once employed by the Catholic Church. John Chrysostom wrote:

Heretics are afflicted in a similar fashion as are those who labor under a disease and who are physically blinded; the latter, because of the weakness of their eyes, resist the light of the sun, and, because of their poor health, they refuse even the best and most healthful foods, whereas heretics, being sick in spirit and blinded in their mental vision, cannot look at the light of truth.

The preachers of progress today have become as shrill as the Inquisitors of old. The “disease of the mind” inevitably spreads when such grand projects fail to fill the void we still feel gnawing at us, reminding us that we have lost something essential, something crucial to the human condition, and prompting us finally to question received truths. We intuit that the loss has diminished us, and many have tried to fill it. The Church called it “original sin,” and promised that salvation would fill that void. It failed. The Enlightenment simply offered another view of salvation, from another source—Reason, rather than Jesus.

The irony, of course, is that neither in Voltaire’s time nor in ours has social change actually happened that way. The triumph of the Enlightenment itself did not happen because the social ideas circulated by its proponents were that much better than those of their rivals; it happened because the core mythic narrative of the Enlightenment proved to be more emotionally powerful than its rivals. That narrative, of course, is the myth of progress, the core element of the worldview that has made, and now threatens to destroy, the modern world.49

The myth of progress is one far more deserving of our scorn. Where is the evidence for it? Neither our medicine,50 our knowledge,51 nor even our art52 can truly be said to have advanced beyond what it was 10,000 years ago. Yet in exchange for this way of life, we suffer an inferior quality of life, even by our own skewed standards.53

The myth of the “Noble Savage” is a Romantic exaggeration, but it echoes an important truth. Primitive life is not perfect, but it’s “damnably hard to improve upon.” It alone is “the gift of natural selection to humanity,” the only way of life proven in the long run to work for people, the only proven sustainable society we’ve ever known. It may be the perspective of the chronically ill looking on at the merely healthy, like Plato’s Allegory of the Cave, but we should remember that the seemingly extraordinary gifts of primitive life are not superhuman. Rather, they are human, with all that entails. We’re the ones who have diverged from that. We’ve lost our most basic human birthright. The destruction we wreak upon the earth and upon ourselves is not born of our malice, but of our pain and neediness.

We don’t need to sacrifice; we’ve already sacrificed too much. We need to demand more. We need to take our humanity back.

Trackbacks & Pingbacks

  1. […] Jason Godesky has written a long and brilliant article about the myth of the noble savage, and the ways in which it is true and untrue, respectively. The article has grown into a kind of collected manifesto in which he ties together threads from his older articles on indigenous cultures. A few samples follow: Horticultural techniques used by Native groups (many of them are now employed by permaculturalists) encouraged higher levels of succession, with species that supported human prosperity while simultaneously encouraging the biodiversity and health of their ecosystem. Rather than putting the two at odds like a modern logger, horticultural techniques put human welfare and ecological health in tandem. … they took their responsibilities seriously: to give back to the land more than they took from it. … The “Noble Savage” is not ennobled by any kind of innate moral superiority, but by a way of life that fundamentally works. Ethical injunctions always fail, because they pit a bad conscience against self-interest. Working societies channel self-interest in directions that are sustainable. Primitive societies don’t give back more than they take because the people are so much kinder and better than we are; they do so because that’s what their societies are structured to accomplish, as systems. What they might deride as “low population density, inefficient technology and lack of profitable markets,” might more accurately be termed human-scale societies, simple, elegant technologies, and an emphasis on people over products in a gift economy. Of course this wasn’t the result of “conscious efforts at conservation.” Such efforts invariably fail. Their success lay in precisely the fact that it was not conscious, and that they weren’t efforts at conservation, but efforts to build wealth and prosperity for them, their children, their community, and the non-human communities they lived with. 
At the heart of agricultural philosophy, however, is a desire to escape this system: to have life, and never death; growth, and never decay; health, and never sickness. In the end, it is a fool’s dream doomed to failure, but the longer it goes on, the more death, decay, and sickness is needed to balance out the folly. We have been trained to think of violence as “wrong” since birth, because it is essential for the state’s continued existence that we accept the myth of its legitimacy, and the founding premise of that so-called legitimacy, the monopoly of force. Only by turning violence into an “evil” thing in the same way we demonize the Trickster can we willingly shed ourselves of such a natural function of animal life. Only then can we believe in the monopoly of force—and only then are we willing to accept its dominance over us, and the mythology of “legitimacy,” whereby our natural born freedom is relegated to “rights” for the state to arbitrarily grant or deny, seems not only reasonable, but the way things should be. Without accepting first that violence is “wrong,” none of the systems of control and domination follow. Not put aside when work was being done, infants remained constantly in touch with the activities of life around them, their tiny hands ever reaching out to whatever items or materials were in use, and onto the hands, arms, and muscles of the users. In this way even as tiny babes-in-arms they began accumulating a kinesthetic familiarity with the implements and activities of life. This familiarity, supplemented by a rapidly developing ‘tactile-talk,’ produced in toddlers an ability to manage objects and materials safely that might be dangerous elsewhere. When first sojourning in those southern hamlets, I was repeatedly aghast to see toddlers barely able to stand upright playing with fire, wielding knives, and hefting axes—without concern by anyone around. Yet they did not burn down their grass/bamboo abodes or chop off their toes and fingers. 
Humans never evolved to eat grains, yet we eat them almost exclusively. Even people who remain entirely civilized in all other respects but adopt a “Paleo Diet” eschewing grains note remarkable improvements in health. … All of this points to the shocking conclusion that when the human body is used in its proper evolutionary context, it is capable of the same amazing feats as other animals. We don’t need to sacrifice; we’ve already sacrificed too much. We need to demand more. We need to take our humanity back. […]

    Pingback by FIMBULVINTER - anarko-primitivism på svenska :: Jason om den ädla vilden :: May :: 2007 — 12 May 2007 @ 5:45 AM

  2. […] “The Savages Are Truly Noble,” The Anthropik Network, 10 May 2007: […]

    Pingback by The Anthropik Network » Noble or Savage? Both. (Part 1) — 11 January 2008 @ 7:44 PM

  3. […] The term “Noble Savage” was first used in the Englishman John Dryden’s 1672 play, The Conquest of Grenada. The noble savage then disappeared from the Western world for nearly two hundred years, until it was rediscovered in 1859 by John Crawfurd, president of the Ethnological Society of London. […]

    Pingback by Asil mi vahşi mi? « isyankar p!renses… — 15 April 2008 @ 12:08 PM

  4. […] The term “Noble Savage” was first used in the Englishman John Dryden’s 1672 play, The Conquest of Grenada. The noble savage then disappeared from the Western world for nearly two hundred years, until it was rediscovered in 1859 by John Crawfurd, president of the Ethnological Society of London. […]

    Pingback by GafBlog » Blog Arşivi » Asil mi vahşi mi? — 16 April 2008 @ 2:54 AM

  5. […] think Jason Godesky put the last nail in the coffin in his thorough article, “The Savages are Truly Noble.” Primitive peoples have an impact on their environment, it’s just a positive one. They fight, […]

    Pingback by Noble Savage Vs. Rewilding | Urban Scout: Rewilding Cascadia — 12 August 2008 @ 10:43 AM

  6. […] The term “Noble Savage” was first used in the Englishman John Dryden’s 1672 play, The Conquest of Grenada. The noble savage then disappeared from the Western world for nearly two hundred years, until it was rediscovered in 1859 by John Crawfurd, president of the Ethnological Society of London. […]

    Pingback by Asil mi vahşi mi? « Lady Lazarus — 12 November 2008 @ 6:31 PM


Comments

  1. If you print this out, it will take 59 pages. I don’t recommend that. That beats out our previous all-time record holder, “Israel’s Fascist Element,” by a solid 10 pages. This is our longest article ever posted on this site, and I hope it will mean never having to write out the same arguments about the “Noble Savage” ever again.

    See also, “The Noble Savage,” which I published exactly two years ago.

    Comment by Jason Godesky — 10 May 2007 @ 3:23 PM

  2. Hi. I just found this site and wanted to let you know I am peeing my pants with joy.

    Comment by Diana M. Poole — 10 May 2007 @ 3:27 PM

  3. We’re glad to have you, but please, take a moment to change your pants first?

    Comment by Jason Godesky — 10 May 2007 @ 3:32 PM

  4. Or put on some adult diapers, like Astronaut Stalker Lady.

    Comment by Giulianna Lamanna — 10 May 2007 @ 4:14 PM

  5. Semi off topic, but let’s all take a moment to commemorate our brother Robert Nesta Marley, who passed from this consciousness on May 11, 1981, a true philosopher and bringer of peace and equality to the world. He made some groovy music too. Plenty on YouTube. Peace to all.

    Comment by Jason G — 10 May 2007 @ 4:22 PM

  6. Isn’t this a day shy of 26 years late?

    Comment by Jason Godesky — 10 May 2007 @ 4:28 PM

  7. http://www.daviesand.com/Papers/Tree_Crops/Indian_Agroforestry/, in regards to the original post (excuse me, I haven’t read it all the way through yet), we will never be able to grow this land until we are allowed to burn it. I must keep reminding people that the fires here in Florida lately are caused by improper agro-forestry practices. Also, on a practical way to build a “rhizome” form of tribal society, integrated with a simple evolutionary ethic, please visit: http://see.org/e-ct-dex.htm

    Comment by Jason G — 10 May 2007 @ 6:44 PM

  8. Hey Jason,
    This quote in particular seems quite important: “But, as Raymond Kelly discusses in Warless Societies & the Origins of War, marginal societies tend not to have the resources to go to war, and affluent societies (like most foragers) have no reason to go to war. So, like agriculture (see thesis #10), warfare develops not from strict scarcity or abundance, but the nerve-wracking, extreme fluctuations between them, with periods of scarcity priming a population’s fears, and following abundance providing them the resources to act on those fears.”
    If we are aiming for at least partly a permacultural solution, then this might be the key towards being relatively peaceful food-producers versus violent ones.

    Comment by MatthewJ — 10 May 2007 @ 8:05 PM

  9. Very impressive. This has my nomination for an essential writing.

    Comment by Locke — 10 May 2007 @ 8:24 PM

  10. Damn dude well done! You covered a helluva lot of territory in this one. (No shit, right?)

    Comment by Andy — 11 May 2007 @ 4:41 AM

  11. I only had time to skim the article, but might I suggest you look into Tim Ingold’s work The Perception of the Environment. I think you will find it amenable to what you are proposing.

    Comment by Kevin Winters — 11 May 2007 @ 10:21 AM

  12. Hey Jason,

    any chance you might add a table of contents and section links/anchors to this one? I would find that terribly useful in citing/linking back and I think others would as well.

    shane

    Comment by shane — 11 May 2007 @ 2:59 PM

  13. Whoops! I just realized there is a t.o.c. near the beginning and the anchors are already there. Though having it at the beginning might be useful.

    shane

    Comment by shane — 11 May 2007 @ 3:01 PM

  14. I’ve always rather boastfully thought of myself as an exceptional writer, but I now humbly pass that sash over to you, Jason. (After I scrub the peanut-butter stains out of it, of course. :-D )

    Comment by venuspluto67 — 11 May 2007 @ 9:44 PM

  15. I thought that this news story that I spotted at reddit.com would be of interest as it would appear to touch on the matters discussed in this article.

    Comment by venuspluto67 — 12 May 2007 @ 11:39 AM

  16. Very educational and even a bit inspirational. The search for a utopia is perhaps why some believe primitivism is something more than just a healthier and more vigorous way to be awake; some wish it to be the ‘heaven’ they search for.

    Dandelions are pretty tasty right now, by the way. Late this year should begin the ‘glare’ of post-peak oil; I guess we will see how things go, beyond prices going up.

    Carpe Diem

    Comment by Bubba — 12 May 2007 @ 11:52 AM

  17. A few disconnected thoughts.

    This essay is golden. Statements like–

    “Part of our civilization’s twisted view of the world is its inability to come to terms with violence in a healthy way…”

    “…even the healthiest of us have been chronically ill for our entire lives.”

    –succinctly articulate something fundamental which few people see, much less are able to put into words.

    1. I tried to read Pinker’s essay only a few weeks ago. I can’t say for sure, but I think he is being deliberately deceptive. His thesis (”Violence has been in decline over long stretches of history…”) indicates his ambivalence about actual dates. In other words, you can look at different long stretches of history, and there are many time periods which begin with a higher level of violence and end with a lower level of violence. Thus he confounds, as you hinted at, the shift from hunting and gathering to agriculture with the shift from medieval Europe to Modern Europe with the shift from modernism to industrial modernism. It’s hard to keep track of the equivocations. Which is probably why he throws in this: “The decline of violence is a fractal phenomenon, visible at the scale of millennia, centuries, decades, and years.” Because he’s smudging a bunch of datums together and calling them data. So when he says, “…despite all these caveats, a picture is taking place,” what he means is “because all these caveats.”

    2. Can you cite the 1-in-37-incarcerated statistic? I don’t doubt it’s true, but I’d like a source.

    3. “They were friendly in their dispositions, honest to the most scrupulous degree in their intercourse with the white man. Simply to call these people religious would convey but a faint idea of the deep hue of piety and devotion which pervades their whole conduct. Their honesty is immaculate, and their purity of purpose and their observance of the rites of their religion are most uniform and remarkable. They are certainly more like a nation of saints than a horde of savages.”

    This seems obviously to be a set of projections more telling of the mind of the European making the statement than of the people being observed. For instance, the words “piety” and “devotion” seem to suggest the struggle with religion in the European-American mind. Religion is hard, the assumption goes, and therefore it takes extraordinary strength of character to adhere to religious observance. I.e., the author’s religion doesn’t satisfy him.

    4. “The manner in which primitive children are raised includes points that make many current “experts” scream, such as cosleeping with parents, or breastfeeding fully to the age of five in some cases.”

    Don’t forget that for a significant period in the 20th Century in America, breastfeeding itself was considered dangerous.

    5. “We normally miss microexpressions, the tensing of muscles, and other “subtle” cues that should be plainly evident, for the simple fact that most of us were not brought up in such a way. It’s almost like being deaf in the midst of a whole society of deaf people; none of us can really appreciate what it is we’re missing.”

    Actually, to say this is the problem with Westerners is an oversimplification. It is found in varying degrees throughout the Western World, yes, but I’ve had some anecdotal evidence that it’s even more pronounced among American whites than either European whites or American non-whites. It’s been repeatedly said to me by both Europeans and American non-whites that American whites have an eerily flat affect (and not just me!).

    6. “…the letters each refer to actual things in the sensuous world, so there is still a point of reference in the living world around us.”

    But it is also a radically abstract reference to the living world, reduced to 22 such symbols.

    Interesting that early Hebrew writing, as Phoenician before it, alternated direction right-to-left, left-to-right, and back again. The practice was called “ox-turning,” an agricultural metaphor referring to the way fields are plowed. Even the principal name of God (”El”) is a combination of the letters Aleph (”ox”) and Lamed (”ox-yoke”), suggesting the same agricultural metaphor. Again, the names for the letters of the Hebrew alphabet originate with the Phoenicians, not the Hebrews. The Hebrews did, however, come up with their own meticulously ornamental letter forms.

    Comment by thistle — 12 May 2007 @ 1:43 PM

  18. Wow! A huge amount of work went into this article. I imagine this is going to become a definitive reference for a lot of folks.

    “Such mind-blowing abundance was the accumulated gift of hundreds, if not thousands, of generations, each one living to give back more than they took.”

    This reminds me of something I saw when I lived in Portland. When Mt. St. Helens blew in 1980, it wiped out ungodly amounts of forest. When it was over, some group — the Forest Service maybe — went into some areas and did what they could to clean up the landscape, and then replanted the trees. Other areas they left untouched, so they could monitor the “natural” regrowth of the “wilderness.” The difference between these areas 20 years later was pretty staggering. The replanted sections of forest are clearly recovering much faster, to the benefit of all the species who live there. (I do not know if they’re using chemical fertilizers and such, but I am inclined to think not except maybe in specific instances, due to the sheer scale of the project.)

    It reminds me also of something from Judeo-Christian mythology. Adam and Eve lived in the Garden of Eden before they were condemned to agriculture. A lot of people, including me, consider this an oral-traditional allegory for hunter-gatherer life prior to the advent of farming (this belief is itself a mythology, but that’s another subject). In the biblical narrative, Adam and Eve are not placed in “the wilds” or some such, but in the Garden “to tend it and to take care of it.”

    Comment by Paula — 12 May 2007 @ 1:56 PM

  19. Hi Jason… I very much enjoy your writing and how it clarifies many issues.

    My familiarity with many of your sources is merely at the consumer level, so I can’t evaluate much in depth. However, as an epidemiologist, one thing I have to ask about is your claim of low (or zero) cancer rates. Are these rates essentially age adjusted, or not? Because a critic could always argue that the pre-literate had low cancer rates only because so few of them survived to old age. Is that true? Or is there evidence that many pre-literate peoples lived to an advanced age, but still very few of them developed cancer? I would like to hear your thoughts.

    It’s worth bearing in mind that the greatest recent gains in life-expectancy have to do with limiting infant mortality, so that if birth rates are taken into account (under the assumption that saving unfit infants is of ambiguous benefit… a delicate issue that touches on certain ethical considerations) our apparent life expectancy would seem shorter.

    Comment by slomo — 12 May 2007 @ 3:34 PM

  20. It’s been about a month since I found this site, while searching for post-crash living and the like. I have been taking my time (of which I currently have very little, exactly because of my job, which takes away 12 hours a day if not longer) slowly reading this site, and now I can finally say I have finished the reading.

    What can I say, I loved the 30 theses more than anything else on the site (the video on life seen from the perspective of the rocks was also very interesting).

    Here in Brasil my family owns an 800,000-square-meter property (more than half of it untouched forest) where I too am making plans for some sustainable living, mostly through horticulture, low-impact animal herding and, of course, probably some hunting and gathering. The property has three small rivers within the forest, with good unpolluted surface water.

    Brasil doesn’t use as much oil as the USA, and most of our electricity comes from hydro-power; however, everybody knows that hydroelectric plants need plastics, oils, and even maintenance made possible only by some kind of fossil energy. We also have sugarcane alcohol here, though it’s heavily subsidized by the government, and no matter how much I ask the people I know whether making one liter of alcohol takes more or less energy than one liter of alcohol yields, no one really answers me. As if they didn’t know, or didn’t want to know, whether it is after all energy-worthy or not.

    I don’t see much solution in the future, and although I live happily with my computer and internet (no car, however; too costly to maintain), I am probably the only medical doctor - specializing in Laboratory Medicine - that I know who owns a bicycle…

    Despite working in a high-technology field, I am hoping I’ll be able to ‘buy’ my way out of civilization as much as I can. I pushed my family into getting the rural property, I helped pay for it, and I am only waiting for my mother’s retirement - when she will live there - to start buying some hardy livestock that can live with the minimum of care and will need no feed rations to live.

    I used to feel somewhat guilty for *wishing* some collapse to come… after finding this site, and many others, I realised the wish for collapse is bigger than I imagined.

    Maybe one day we will still see it, although sometimes I think - sadly - it will not happen in our lifetime. But who knows…

    Comment by Denise Silveira — 12 May 2007 @ 10:29 PM

  21. Jason,

    As noted above, a staggering, excellent compendium of noble savage-related information.

    One question regarding the Iroquois cannibalism, and more broadly, primitive violence: Were the Iroquois foragers? Zerzan suggests that, in addition to many of the accounts of primitive viciousness being exaggerated or simply fabricated, it’s important to note that these cultures were typically not foragers, strictly speaking. Many cultivated crops or had domesticated dogs, for example. Considering the evidence that foraging and horticulture exist on something of a continuum, perhaps this is a moot point, based on a dichotomy that wasn’t there. But in any event, can you speak to the argument that this sort of violence was atypical of foragers and seems to occur only among domesticators?

    More broadly, how do you countenance the fact that horticulture is also a fairly recent practice in human history? (Or is it? Maybe that notion is outdated and disproven?) Perhaps even perma-/horticulture is a mistaken departure from our long history as foragers, and worth utilizing in the transition away from empire, but not necessarily a long-term strategy? I know elsewhere you’ve mentioned that there’s no such thing as a “pure” forager- why not?

    Anyway, kudos!

    Comment by Archangel — 12 May 2007 @ 11:16 PM

  22. I used to feel somewhat guilty for *wishing* some collapse to come by… after finding this site, and many others, I realised the wish for collapse is bigger than I imagined.

    I’m a relative newcomer myself, but I understand that there was some unpleasantness over that very matter here that led to a somewhat acrimonious falling out. It’s not so much wishing the collapse would happen. After all, it’s going to mean historically unprecedented suffering and death. It’s more a recognition of the fact that nature’s design is for us to live very simple lives at low levels of population, and the more we attempt to defy that, the more intense and ghastly will be nature’s restoration of the accustomed equilibrium.

    Comment by venuspluto67 — 13 May 2007 @ 10:13 AM

  23. Wow, well this has certainly garnered quite a response! Thank you all!

    If we are aiming for at least partly a permacultural solution, then this might be the key towards being relatively peaceful food-producers versus violent ones.

    I believe so, but this is largely out of our hands. Good years and bad years happen. Fortunately, the back-and-forth you need to spark agriculture (and war) is relatively rare–really only in a Holocene-esque interglacial.

    Can you cite the 1-in-37-incarcerated statistic? I don’t doubt it’s true, but I’d like a source.

    Gail Russell Chaddock, “US notches world’s highest incarceration rate,” Christian Science Monitor, 18 August 2003.

    This seems obviously to be a set of projections more telling of the mind of the European making the statement than of the people being observed.

    Absolutely. That was used to illustrate the notion of the “Noble Savage” with regards to honesty; as with the other illustrations of the myth, it says much more about European attitudes than Native lifeways.

    Actually, to say this is the problem with Westerners is an oversimplification. It is found in varying degrees throughout the Western World, yes, but I’ve had some anecdotal evidence that it’s even more pronounced among American whites than either European whites or American non-whites. It’s been repeatedly said to me by both Europeans and American non-whites that American whites have a eerily flat affect (and not just me!).

    Naturally, there are some variations, but I don’t think you can fairly say it’s simply an American trait, either. After all, we got it from all of that English “stiff upper lip” stuff ourselves, and then added notes of downright gregariousness (which often cross the line into obnoxiousness and crassness), equally stereotypical American traits. But as a whole, I think it’s fair to say that all civilized cultures are a world apart from all primitive cultures in this respect.

    But it is also a radically abstract reference to the living world, reduced to 22 such symbols.

    Meh, I’m not so sure. After all, they’re broad symbols, and deeply metaphorical. Biblical Hebrew has one of the smallest vocabularies in the world, and the result is that most words have multiple meanings: thus, it’s difficult to not speak in metaphors and poetry. The farming paradigm seems pretty spot-on, but there’s still plenty of the living world breathing through some writing systems. I don’t have a problem with abstraction; otherwise, I’d have to reject all humans everywhere as “fallen.” Abstraction can help you relate all the more to the living world–or it can tear you apart from it. The question of whether it’s abstract or not is a silly one; the better question is what the abstraction accomplishes: does it put you in touch with the sensuous world, or break you off from it?

    Wow! A huge amount of work went into this article. I imagine this is going to become a definitive reference for a lot of folks.

    I hope so; if we can consider these points more or less settled, we can move on to more interesting territory. I’m tired of having these same arguments all the time, so I hope from now on, I can just link to this, rather than having to go through the whole argument over and over again.

    It reminds me also of something from Judeo-Christian mythology. Adam and Eve lived in the Garden of Eden before they were condemned to agriculture. A lot of people, including me, consider this an oral-traditional allegory for hunter-gatherer life prior to the advent of farming (this belief is itself a mythology, but that’s another subject). In the biblical narrative, Adam and Eve are not placed in “the wilds” or some such, but in the Garden “to tend it and to take care of it.”

    See also, “Wilderness & Its Troubles.” Hunter-gatherers and horticulturalists exist on a continuum; horticulture only works on a society-wide scale when it’s supplemented with hunting in a healthy “zone 5,” and hunter-gatherers always use techniques to favor the regrowth of their favorite plants to one degree or another, so the difference is primarily one of emphasis.

    However, as an epidemiologist, one thing I have to ask about is your claim of low (or zero) cancer rates. Are these rates essentially age adjusted, or not?

    According to the source linked in the footnote, they are: “Medical anthropologists have found little cancer in their studies of technologically primitive people, and paleopathologists believe that the prevalence of malignancy was low in the past, even when differences in population age structure are taken into account.” Eaton S.B., Pike M., et al. (1994) “Women’s reproductive cancers in evolutionary context.” The Quarterly Review of Biology, vol. 69, pp. 353-367.

    Of course, the idea that hunter-gatherers have shorter lifespans than agriculturalists is largely a myth, so age correction doesn’t correct for much. But it has been corrected for, whatever effect it had.
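    Since age adjustment is the crux of this exchange, here is a minimal sketch of what “age-adjusted” means in practice: direct standardization weights each age group’s disease rate by a common reference age distribution, so two populations with different age structures can be compared fairly. All numbers below are made up purely for illustration; none come from the Eaton et al. study.

```python
def age_standardized_rate(rates_by_age, reference_weights):
    """Weighted average of age-specific rates, using a shared
    reference age distribution (weights must sum to 1)."""
    assert abs(sum(reference_weights) - 1.0) < 1e-9
    return sum(r * w for r, w in zip(rates_by_age, reference_weights))

# Hypothetical age-specific cancer rates per 1,000 (young, middle, old)
forager_rates = [0.1, 0.5, 2.0]
industrial_rates = [0.2, 2.0, 8.0]

# The same reference age distribution is applied to both populations,
# removing "they just died younger" as an explanation for rate differences.
reference = [0.4, 0.4, 0.2]

forager_adj = age_standardized_rate(forager_rates, reference)
industrial_adj = age_standardized_rate(industrial_rates, reference)
```

    With both populations projected onto the same age structure, any remaining gap in the standardized rates reflects the rates themselves, not differing survival to old age.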

    Because a critic could always argue that the pre-literate had low cancer rates only because so few of them survived to old age.

    My first objection would be to the notion that hunter-gatherers don’t live as long. They generally live longer than farmers or pastoralists in the area. But even so, it’s still been corrected for.

    Or is there evidence that many pre-literate peoples lived to an advanced age, but still very few of them developed cancer?

    Absolutely, as discussed in thesis #25, forager lifespan is roughly equivalent to the lifespans we see in the industrialized world today.

    Were the Iroquois foragers? Zerzan suggests that, in addition to many of the accounts of primitive viciousness being exaggerated or simply fabricated, it’s important to note that these cultures were typically not foragers, strictly speaking.

    The Haudenosaunee were textbook examples of horticulturalists. See also, “Exceptions that Prove the Rule, #1: The Iroquois.” But even so, Zerzan’s point only gets you so far. There’s a continuum between foragers and horticulturalists, and foragers have been known to engage in some pretty dastardly violence of their own. The !Kung and the Inuit do kill each other, sometimes brutally.

    Many cultivated crops or had domesticated dogs, for example.

    Dogs are the oddball of human domestication; that’s a genuine case of co-evolution, and it involved as much change on our part as on the dogs’. See “Wolves & Dogs” and “Alpha Dogs, Wolf Packs & the Wandering Free Families.” Zerzan often sticks to a hardline definition of foraging purity that would eliminate all humans. Many hunter-gatherers make pets of the infants of animals they kill, and release them once they’ve grown up. Dog “domestication” far predates the domus, stretching back into the mists of human evolution. Such purity simply does not exist, so the distinction only gets us so far.

    But in any event, can you speak to argument that this sort of violence was atypical to foragers and seems to occur only among domesticators?

    It’s a mythical distinction. The !Kung are highly trained archers carrying poison-tipped arrows, and Inuit blood feuds are legendary. They know how to kill people, and how to wage war, and when it comes down to it, they do. All animals fight; most animals limit such violence. Only in civilization does it reach the levels we see today.

    More broadly, how do you countenance the fact that horticulture is also a fairly recent practice in human history?

    Well, certainly some level of favoring the regrowth of favorite plants is incredibly ancient; we can see as much in other animals as well. Active horticulture, like agriculture, is only about 10,000 years old. So far, it seems sustainable, but you’re right, as I’ve said before, it may simply be unsustainable on a slightly longer timeline. It could be that it only seems to work next to agriculture. But this isn’t something we can really say, one way or the other. Right now, it seems sustainable, and it’s certainly a step in the right direction. If nothing else, permaculture offers perhaps the best alternative for healing the damage we’ve done, reconciling us to the living earth, and creating an environment favorable to hunter-gatherers.

    I’m a relative newcomer myself, but I understand that there was some unpleasantness over that very matter here that led to a somewhat acrimonious falling out.

    Indeed. I can certainly understand it as an ill-considered outburst, but collapse isn’t something to be longed for or hoped for. What comes after it might be, but collapse itself will be a very painful process, including great suffering and death. We are in the midst of great suffering and death now, so collapse’s only advantage over the present is that collapse ends. Of course, to most people you live with, attuned only to the suffering of their own species, collapse will be the first time they note any such suffering. It’s not that their suffering is so much more worthy, but rather, that no suffering is ever something to look forward to, not even the last spasm that ends this all.

    Comment by Jason Godesky — 14 May 2007 @ 11:44 AM

  24. I know it’s a bit of a tangent, but I see that a few posts have been relating to perma-/horticulture and “how to move forward”. Besides, I’ve been meaning to dig this up for a while now. This is from:
    http://www.motherearthnews.com/DIY/1982-07-01/The-Plowboy-Interview-Masanobu-Fukuoka.aspx

    FUKUOKA: I had started experimenting in some of my father’s mandarin orange orchards even before the war. I believed that—in order to let nature take its course—the trees should grow totally without intervention on my part, so I didn’t spray or prune or fertilize . . . I didn’t do anything. And, of course, much of the orchard was destroyed by insects and disease. The problem, you see, was that I hadn’t been practicing natural agriculture, but rather what you might call lazy agriculture! I was totally uninvolved, leaving the job entirely to nature and expecting that everything would turn out well in the end. But I was wrong. Those young trees had been domesticated, planted, pruned, and tended by human beings. The trees had been made slaves to humans, so they couldn’t survive when the artificial support provided by farmers was suddenly removed.

    PLOWBOY: Then successful natural farming is not simply a do-nothing technique?

    FUKUOKA: No, it actually involves a process of bringing your mind as closely in line as possible with the natural functioning of the environment. However, you have to be careful: This method does not mean that we should suddenly throw away all the scientific knowledge about horticulture that we already have. *That course of action is simply abandonment, because it ignores the cycle of dependence that humans have imposed upon an altered ecosystem.* If a farmer does abandon his or her “tame” fields completely to nature, mistakes and destruction are inevitable. The real path to natural farming requires that a person know what unaltered nature is, so that he or she can instinctively understand what needs to be done—and what must not be done—to work in harmony with its processes.

    Emphasis mine. I’m inclined to agree with Fukuoka on this one, and that’s one reason I support permaculture (and more importantly, rewilding of humans & habitat as an intertwined process) as a “go-forward” step.

    Comment by jhereg — 15 May 2007 @ 9:24 AM

  25. That also hints at one of the things that excites me most about permaculture, something Fukuoka doesn’t seem to address—namely, if domesticates have essentially been enslaved, shouldn’t we be using permaculture to help rewild them?

    Comment by Jason Godesky — 15 May 2007 @ 1:39 PM

  26. Great job Jason. very very good article, and I believe it does indeed “put the Noble Savage Myth to rest”.

    Comment by Rory — 15 May 2007 @ 1:51 PM

    something Fukuoka doesn’t seem to address

    Actually, he does. He’s whole-heartedly fond of un-enslaving domesticated plants. The one and only exception I found was in reference to the citrus trees he grew, and even then he pretty much says that he would prefer to just let the trees cross & hybridize (essentially move toward being wild again), but the market wouldn’t have accepted the results, and citrus fruit was a big part of how he stayed afloat.

    A few years ago, I set aside a spot to try his proposed orange-tree rewilding program (which I don’t think he ever did; see above for reasons) with tomato plants. The result is now a tomato patch in central Ohio that’s never replanted; the soil is never tilled, with minimal compost, no chemicals, and very vigorous and productive plants. The one caveat is that the fruit varies widely and may show surprising traits. Some plants will have beefsteak-like texture, but be a little larger than a cherry tomato, for example. But I’m not doing this for market, so….

    Comment by jhereg — 15 May 2007 @ 1:59 PM

  28. Thanks Jason for another reminder of how I was so damn lucky to stumble upon this site. I never quite made it to peeing my pants, but maybe that’s only because my tyrannical intellect forbids any kind of 2-way dialogue with the rest of my internal organs ;)

    1. - I’d not fully appreciated the importance of Quinn’s caveat: tribes “wherever they are found uninfluenced by civilisation [or words to that effect]”. Hearing Sorenson’s description of collapsing pre-conquest consciousness is a deeply painful reminder of what we’ve all lost since childhood, and as societies, since agriculture & domestication.

    2. - “Epidemic sleeplessness, frenzied dance throughout the night, reddening burned-out eyes getting narrower and more vacant as the days and nights wore on…” - you can now see all these symptoms among civilised peoples. Our own culture is meaningless and dead to us these days, especially if we’re young.

    3. - “Heretics are afflicted in a similar fashion as are those who labor under a disease…” I love it when puritans bring up epidemiological analogies to denounce heretics, because everything they say can be turned completely inside out, so they start describing their own pathologies better than any immune outsider ever could. If I can self-promote a bit (more), here’s a short poem I wrote after reading some Dawkins a while back:

    Meme Warfare

    Being the Ease of Mind
    Sent now into a diseased landscape
    To cure.

    No wonder the Diseased Mind
    Will see in our wake a plague of decay and corruption,
    Will of course do its best to stem the flow of our tide:
    Naturally with no consideration of where it has its source.

    Not every contagion thinks itself the cure
    Theirs and ours are the exception
    And we’ve got the better cards.

    Comment by Ian M — 15 May 2007 @ 6:04 PM

  29. I saw a TV show the other month about whistling. Some southern European (I think) village in a mountainous area has a whistling language - quite complex. It’s ‘spoken’ almost as much as the other local dialect. Whistles propagate over ridges better than other oral sounds. In the show they had one person on the next ridge, who was told to ‘pick a yellow flower’ via the whistling dialect, which he did without hesitation. Verbs, adjectives and nouns - not bad!

    Comment by Steve Z — 17 May 2007 @ 7:15 AM

  30. That last comment was in relation to the Koyukon language, BTW.

    Comment by Steve Z — 17 May 2007 @ 7:19 AM

  31. Excellent stuff, Jason - a very gratefully received effort which hits all the right spots with a great balance between readability and detail.

    I had to laugh when I read your first comment on my print-out, though: “If you print this out, it will take 59 pages. I don’t recommend that.” I knew I wanted to properly digest this after reading the intro, so I immediately printed it - after saving it first, and hacking the CSS a bit. I think printing on Anthropik could be helped loads with a print stylesheet that allows the content to span the page width :-)

    Comment by Gyrus — 22 May 2007 @ 6:12 PM

  32. Hmmm… I’m suddenly reminded of all those “print version” links on newspaper and magazine websites. We could do something like that here. That’s a great idea, Gyrus!

    Comment by Giulianna Lamanna — 22 May 2007 @ 6:42 PM

  33. This just in…

    ‘Fireballs set half the planet ablaze, wiping out the mammoth and America’s Stone Age hunters’
    http://observer.guardian.co.uk/world/story/0,,2083758,00.html

    Certainly adds weight to the idea that megafauna extinctions weren’t the responsibility of hunt-happy humans… In loads of other ways, too, a fascinating and quite dramatic revelation.

    Comment by Gyrus — 23 May 2007 @ 8:33 AM

  34. See also:

    http://space.newscientist.com/article/dn11909-did-a-comet-wipe-out-prehistoric-americans.html

    “The comet-strike also offers a third and radical hypothesis for the massive extinction of mammals, which for years palaeontologists have blamed on the sudden Younger Dryas freeze, combined with the hunting prowess of newly arrived Clovis bands.”

    Comment by Gyrus — 23 May 2007 @ 8:41 AM
