Saturday, January 17, 2009

Eurozine - Culturalism: Culture as Political Ideology


This is an excellent article from Eurozine demonstrating that both the liberal view (multicultural respect for all cultures) and the conservative view (nationalistic protection of one's own culture) are simply variations on the same belief: that individuals are formed and shaped by the culture in which they live. Both views also maintain that culture should be protected from corruption, in whatever form that might take.

From an integral view, both are correct, but partial. We are, indeed, embedded in our cultural environment in many ways, and thus shaped by it. However, the moment we become aware of this fact and begin to observe it is the moment at which it is no longer the dominant determinant of who we are and how we define ourselves.

When we begin to treat culture as an object of our awareness, we can begin to dis-identify from it, loosening its power over us, and become unique, post-cultural individuals.

Anyway, here is the article (it's very long, but worth the read). It takes a different angle on the topic of culturalism than I do, but it's an interesting one. I don't necessarily agree with their anti-Islamic rhetoric, but they are on the right path for the wrong reasons. Fundamentalist Islam is dangerous, but not because of its political beliefs. Rather, it is a pre-rational expression of culture, one embedded in a tribal, mythic view of the world.

More advanced versions of Islam, such as those practiced by many American Muslims, are rational and even multicultural in their worldviews and, thus, not in any way dangerous. We need to get to a point where we can understand that it is not specific religions that are dangerous, but rather the worldview through which they are understood.

Culturalism: Culture as political ideology

By Jens-Martin Eriksen, Frederik Stjernfelt

The controversy on multiculturalism has changed the political fronts. The Left defends respect for minority cultures while the Right stands guard over the national culture. But these two fronts merely constitute two variants of a culturalist ideology, argue Jens-Martin Eriksen and Frederik Stjernfelt.

Culturalism is the idea that individuals are determined by their culture, that these cultures form closed, organic wholes, and that the individual is unable to leave his or her own culture but rather can only realise him or herself within it. Culturalism also maintains that cultures have a claim to special rights and protections – even if at the same time they violate individual rights.

The culturalism of today, in which culture becomes a political ideology, thrives on both the Left and the Right. Most well known is leftwing multiculturalism, which has a radical, anti-democratic variant as well as one that suggests that it is possible to harmonize multiculturalism and (social-) liberal views. However, multiculturalism can also exist in forms that belong to the far Right, such as the French concepts of ethnopluralism, the idea that all cultures have the right to autonomy as long as each remains in its own territory. This approach results in political conclusions to the effect that immigrants must either allow themselves to be assimilated lock, stock and barrel, including everything from their religion down to their cuisine, or else return to their original native countries (assuming that such countries exist).

Culturalism has an entire range of categories in common with nationalism; indeed, nationalism in reality constitutes a subvariant of culturalism, in which a single culture provides the basis for the state. Therefore it does not come as a surprise that the present nationalist renaissance in European politics makes use of culturalist ideas to a great extent. On the domestic stage, the Danish People's Party is the obvious example in its re-adoption of Danish nationalist ideas from the nineteenth and early twentieth centuries, including its radical anti-Enlightenment stance. Since the Mohammed cartoon controversy, the party has felt a strategic need to join the defenders of freedom of speech against Islamist machinations. And irrespective of what one can surmise to be the motives for this about-turn, it has to be noted that it was possible only as a result of the party claiming freedom of speech as a "Danish value", as though it were a homegrown invention. This is naturally a culturalist falsification of history: freedom of speech is not a Danish invention. Its roots are of course found in international Enlightenment movements; freedom of speech is a high-quality import. It is something that liberal and democratically minded forces, by dint of great effort and at great cost to themselves, managed to force through in the face of Danish absolutism and the Danish State Church, until the right was formalised in the June Constitution of 1849.

An immediate problem in Denmark – and also in international politics – is therefore that there is culturalism on both sides of the political spectrum. On the Left we hear culturalist battle cries calling for the recognition of the most anti-modern and unappetising cultural practices; on the Right we hear the battle cry of Danishness and the reawakening of a most anti-modern and unpalatable Danish nationalism. These two versions of culturalism are natural enemies, even though they base themselves on the same spurious system of ideas. For a hundred years, French and German nationalisms were each other's main opponents, yet frequently drew on exactly the same intellectual heritage. One culturalism is the automatic enemy of the other precisely because culturalisms are naturally particularisms, which is to say, they each select their chosen people – and not all people can be equally chosen. But this evil and strident antiphon of particularisms, in which the reinforcement of the culturalism of the Left frightens more voters into moving towards the culturalism of the Right and vice versa, ought not to persuade anyone into believing that culturalisms of the Left and the Right constitute the main antithesis in modern politics. On the contrary, the conflict is between Enlightenment and culturalism – between democracy, political liberalism, the rights of the individual, universalism and the Enlightenment on the one hand, and on the other hand the unenlightened maintenance of culture, tradition and authenticity, and the conservative opinion that the individual is linked by fate to a specific culture.

There are consequently two kinds of criticism of Islam that often sound as though they were related, but which must not be confused. One of them criticises Islam as such because it is a foreign religion that is irreconcilable with Danish values and Danish traditions. This is the criticism of one culturalism expressed by another; it is Jesus Christ against Mohammed, one mythological figure in fateful conflict with another. The other criticism, meanwhile, attacks Islamism, not because it is un-Danish, but because it is a totalitarian ideology related to the various forms of totalitarianism of the inter-war period in Europe. This is an informed criticism of a political movement that is opposed to the open society and fundamental democratic principles. It is not directed at Islam as such, but rather focuses on the ideological, political and social barriers that cut off individuals from their rights. Whether these barriers have their background in cultural, political, religious or other dogmas is ultimately irrelevant.

There is scarcely a more important task in contemporary politics and political philosophy than giving full consideration to the development of universal Enlightenment, and turning with the greatest possible force against both the prevailing rightwing and leftwing forms of culturalism and their enslavement of the individual in his or her own "culture".

A glance at the criticisms directed by the Left at the culturalism of the Right provides a point of reference for how far the Left has strayed from its starting point in the Enlightenment. It also reveals how little the Left actually knows about its political opponents in the battle that has developed over the last few decades, during which the question of culture has appeared on the agenda and gradually replaced the prior debate over divergent political utopias.

Let us take a look at the task facing leftwing culturalism and at the way in which the two culturalisms are blind to the similarities between them. In Denmark, it is remarkable that since the defeat of the Left in the parliamentary elections in 2001, leftwing culturalism has not yet been able to produce an analysis of its ostensible opponent, the Danish People's Party. It seems that many years after its defeat, the Left has not been able to move on. It continues to base its ideas on what it sees as the only thing applicable to the Right: that it is "racist" and that the voters the Right has succeeded in mobilising are either "racist" or suffer from other psychological defects such as "Islamophobia". Political analysis seems in some way to have been taken over by a rather slipshod social-psychological diagnostic. Naturally, this finds expression in repeated accusations of racism aimed at rightwing culturalism.

In his book Islams and Modernities, the Syrian philosopher Aziz Al-Azmeh points out that differentialism, which is a more generalised concept for racism, has undergone what he calls a "de-racialisation". "Race" is no longer used as a valid form of identification, and all that is left is the culturalist argument. In Denmark, the Danish People's Party should be understood as being a culturalist party whose attitudes are an expression of a modern differentialism. No major political movement in Denmark or anywhere else in Europe bases its platform on racism. Such a position is no longer held by an elite and is not represented by any but radical losers without political significance.

But why is the Left unable to diagnose culturalism in its political opponent and to launch an offensive against the opinions that the party really represents? Logically enough, this is due to the fact that they allow themselves to be blinded by the same cultural views as their homologous opponents: they are themselves culturalist. And this naturally establishes limits to the extent to which they are able to analyse their opponents' position.

Both culturalisms express respect for cultural differences and espouse their belief in the protection of these identities. Right and leftwing culturalists merely maintain these protective measures under various guises. Leftwing culturalists claim that various distinct cultures should be able to co-exist on the same territory or in the same state, where, formally or informally, different jurisdictions for individuals are applied, according to the cultural group into which they were born. Rightwing culturalists maintain the same attitude towards preserving cultural identity, but each culture in its own territory, each culture in its own country.


An important and frequently overlooked effect of the growing importance of the two forms of culturalism on contemporary politics is that social groups that had previously organised themselves on the basis of "interests" are now increasingly organising themselves on the basis of "culture". This naturally divides these groups politically.

British philosopher Brian Barry writes:

The proliferation of special interests fostered by multiculturalism is [...] conducive to a politics of 'divide and rule' that can only benefit those who benefit most from the status quo. There is no better way of heading off the nightmare of unified political action by the economically disadvantaged that might issue in common demands than to set different groups of the disadvantaged against one another. Diverting attention away from shared disadvantages such as unemployment, poverty, low-quality housing and inadequate public services is an obvious long-term anti-egalitarian objective. Anything that emphasizes the particularity of each group's problems at the expense of a focus on the problems they share with others is thus to be welcomed.[1]

If underprivileged groups can be persuaded to become more concerned with religion, culture and identity, they will be split, and the focus will be moved away from concrete political problems. The current configuration in Danish politics, in which many disadvantaged Danes support the culturalist Right, while immigrants and multiculturalists support the Left, is a striking example of this phenomenon. It probably constitutes one of the main structural reasons for the profound crisis in the Social Democratic Party, whose core voters are now distributed according to cultural affiliation rather than their own interests. The question poses itself as to how long the Social Democratic Party and the rest of the Left intends to allow itself to be guided by the delusion of culturalism.


The Left's progressive involvement with the hardline concept of culture, both in Denmark and internationally, is one of the most important and least recognised political developments of the last thirty years. Culturalism, in its political and leftwing forms, is by no means a recent phenomenon. Its first appearance on the world stage came in 1947, when American anthropologists attempted to derail the UN Human Rights Charter. They refused to accept that it was possible to presume universal human rights, since this would suppress individual cultures. However, the Western Left – whether in its Communist, Social Democratic or social liberal variants – was at that time so international in its views that culturalism remained below the surface. Meanwhile, in the 80s and 90s a vacuum was created by the demise of Marxism and its role as a reference point for leftwing parties in the West. The profoundly conservative cultural ideas of culturalism subsequently and surreptitiously moved into this arena. The surprising thing is that this transformation took place largely without a blow being struck – although culturalism is in many respects diametrically opposed to Marxism. Whereas Marxism maintains that culture is a superstructure built upon socio-economic regularities, culturalism holds that the economy of a society depends on its culture and that culture's value systems, or at least that the economy is indistinguishable from all other cultural features in the society in question.

In this way, culturalism constitutes a kind of anthropological counter-revolution that turns Marxism on its head. If one reflects on the argumentation of the Left in the 1960s and 1970s, it was above all the economy, the class struggle, the means of production, sociology, political systems and resources that were seen as crucial, and it was quite rare and peripheral for the term "culture" to appear. The reverse now applies, and culture attracts far more attention than economics and society – but there has never been any major confrontation in which one model was exchanged for another, as might be expected in ordinary political debate. There have been no furious confrontations between parties concerning the absolute importance of the economy or culture. The transition between opposites has been achieved through a gentle transformation, almost from one day to the next, often without the figures embodying the two attitudes being aware of what was taking place. This is perhaps due to the fact that both Marxism and culturalism have an even simpler and deeper pattern in common: the phenomenon of an oppressed group in relation to the dominant majority. It is then possible to take the political side of the oppressed, following the leftwing slogan of the 1970s: "An oppressed people is always right". This was understood quite literally, with implications that far surpassed the argument that an oppressed people have the right to be liberated from their oppression. They were now right with respect to all their cultural dogmas, regardless of whether what these dogmas maintain is just or true; what was important was that they were derived from the culture of an oppressed people – an argument purely ad hominem. It was thus possible to replace the working classes with "the oppressed culture" – even if the implication of this was that emancipation was to be replaced by disciplined culturalism, which maintains antiquated and pre-modern norms – which is to say, an absolute reversal, in terms of both philosophy and values, of what the Left used to stand for.

In her book La tentation obscurantiste, the French journalist Caroline Fourest presents an interesting hypothesis regarding the advance of what we call leftwing culturalism. She notes that the two great prototypical points of identification for the European Left during and after World War II were the anti-totalitarian struggle on the one hand and decolonisation and anti-imperialism on the other. For a long time they were able to co-exist without conflict; but, following the important growth of Islamisms in the Islamic countries and among Muslim immigrant groups, the Left found itself divided according to which of the two principal causes was considered most important. If the anti-totalitarian struggle was considered crucial, people tended to turn against Islamism as yet another form of totalitarianism from the inter-war period. But if the anti-imperialist struggle was considered paramount, the tendency was to support Islamism as a legitimate challenge to Western imperialism, at first in its colonialist version and subsequently in its globalised version. This latter choice naturally opened up the Left to culturalism. This turns out to be a twofold problem for the hardline, multicultural Left: culture means at once too little and too much. On the one hand, it is very important, in that it provides an individual with an identity and therefore the right to political care and protection – a conservatism built into the culturalist concept of culture. On the other hand, the Left has historically maintained that culture has no meaning in itself, for it is economic and social conditions that are the critical determining factors. Yet at the same time, this Marxist doctrine is behind the multiculturalist idea that all cultures, irrespective of how anti-democratic and anti-liberal they are, can a priori co-exist in the same society. This duality is naturally a constant source of confusion for hardline leftwing multiculturalism. Culture is at once an immutable source of profound identity and at the same time a purely surface phenomenon based on economic determinants. It is naturally impossible for both to be true.


One encounters the concept of "Islamophobia" with increasing frequency; it is used to stigmatise any criticism of Islamism and aspects of Islam that conflict with democracy, human rights and the constitutional state. The hardline, culturalist concept of culture throws a crucial light on the problems raised by the word, first deployed by the Islamic world organisation OIC in its struggle against human rights, and more specifically the freedom of speech. The campaign against the cartoons published in the Danish newspaper Jyllands-Posten and the subsequent Mohammed crisis was an example of this struggle. The word "Islamophobia" is increasingly used by Islamic organisations and hardline multiculturalists in an attempt to limit criticism of Islamic movements. Both valid and unfounded criticisms of various forms of Islam are brushed aside by the argument that they constitute "Islamophobia" – and are thus grouped with racism, anti-Semitism, homophobia, etc. In this way, the concept has also been able to infiltrate the Left, where it is used alongside "xenophobia" and the other terms listed above. Even semantically, with its clinical suffix "phobia", the word expresses a negative quality – it taints critical analysis with the implication of mental illness. However, the crucial problem with the word "Islamophobia" is in fact that, unlike the other words that are similarly constructed, it applies to a set of opinions. Racism, homophobia and so on are words that speak to a disproportionate reaction to qualities intrinsic to an individual – the colour of their skin, their sexual orientation, etc. But Islam is not a race. Islam is a set of beliefs exactly like other sets of beliefs such as Christianity, communism, liberalism, conservatism, Nazism, Hinduism and many other widely divergent intellectual doctrines of a religious, political or philosophical nature. This is not altered by the fact that certain Muslims, including Islamists, are adamant in their conviction that their set of beliefs is particularly prominent and transcends conventional debate, development and revision. As has been said, democracy is based on the principle of "taking sides according to opinion" and "holding one view until you adopt another" – as two Danish democratic mottoes have it.

Philosophies are susceptible to open and ongoing criticism and revision – and if you are resolutely convinced of your position, you are naturally welcome to attempt to uphold this true faith in an unaltered form. But you cannot compel others to participate in this by asking them to refrain from criticism. And this is what the word "Islamophobia" attempts to do. By adopting the term uncritically, with its intentionally manipulative and disciplinary effect, the Left has made itself useful to Islamism. This approach has also served to paralyse the Left's own ability to reflect. Any criticism of culturalism is deflected and condemned as Islamophobia. And as such it is politically excluded.

The acceptance of the term "Islamophobia" is achieved precisely by invoking a hardline concept of culture, as if there were such a thing as homo islamicus. This is carried out in alliance with those various practices in Islam that specifically attempt to give the Muslim religion the quality of fate: requiring, in all circumstances, that male spouses in mixed marriages convert to Islam; suppressing information regarding other choices; and, most importantly, upholding the prohibition against apostasy. (Apostasy is always punished – by a fine, "re-education", the confiscation of property and compulsory divorce from the husband or wife, by so-called "civil death", or even by real death.) For this reason, "Islamophobia" leaves a particularly strange taste in the mouth. The word transforms religion into race.

The reason why intellectual Islamism has succeeded in infiltrating international forums, the political Left and liberal groups is that it has been able to gain general acceptance of the cultural argument. This has been achieved through the popular anthropological concept of culture – culturalism. It is all the more harmful to democratic debate, as it tends to de-politicise dogmas that are essentially political, and thus properly open to criticism – and insult.


Political opinions are one-sided by nature: liberalism, conservatism, social liberalism, social democracy, socialism and so on all compete against each other – although as a rule they are united on a more basic level, each in turn confronting fascism, communism, Islamism and other totalitarian "isms". But if a set of dogmas, a political movement, is defined as "culture", there is a tendency for it to be immediately left in peace and for it to no longer be seen as a single partisan and discussible point of view among others. According to this concept, cultures are organic, irreducible totalities in themselves. Hence, cultures not only have a right to existence and a claim to respect – and to have privileges conferred on them – they also have a claim to protection and to the right to continue living in an unchanging way. This was made topical in the case of the caricatures in Jyllands-Posten, which were accused of insulting a culture.

In our book The Politics of Segregation,[2] we asked Islamists in multicultural Malaysia why they believed it was inappropriate to criticise, mock or hurt people holding different opinions, and how one might instead behave when dealing with a subject such as the one Jyllands-Posten wanted to address. A director of studies at an Islamic university explained that it is necessary first to enter into dialogue with the party you wish to criticise, before anything is printed. In the case in question, Jyllands-Posten ought to have called, for instance, Islamisk Trossamfund ("The Islamic Religious Community" – a Danish branch of the Muslim Brotherhood) to seek permission. The consequences for democratic discussion that this implies are quite remarkable: if this approach were systematically carried out, any exchange of viewpoints would be removed from the public sphere and relegated to a closed forum, in order to prioritise mediation between parties. The result of protecting cultures in this way would be to close down open public discussion, and to abandon free debate among citizens.

If one were to follow this logic it would naturally have dramatic consequences for the way in which democracy functions. Jyllands-Posten caricatured scurrilous political ideas about using religion in the service of politics, as in Kurt Westergaard's famous cartoon of the prophet with a bomb in his turban. But the Islamists attempted to delegitimise Jyllands-Posten by accusing the newspaper of Islamophobia.

In all its variants, whether reformist, revolutionary or terrorist, Islamism agrees that society should be organised according to the principles of Shariah. When this is categorised as "culture", it becomes possible to reject any exterior criticism as "Islamophobia" or "racism" because the critics are not "respecting" a "culture". Nazism attempted something similar when it presented itself as the continuation of ancient Germanic culture; however, in those days, the critics were sharper than the Left today, and were able to see through the rhetoric. We are now in the process of witnessing how Islamist movements such as Deobandi, Wahhabism, Salafism and the Muslim Brotherhood (directly influenced by Italian fascism and the French fascist Alexis Carrel) are protected by the "cultural" argument: they are not in fact political programmes but in reality "cultures", which eo ipso cannot be criticised. But as soon as cultures enter the political arena, they must, by definition, be as accountable to discussion and criticism as all other associations, groups, parties and movements that make political demands. In this regard, neither priests, imams nor clerics – of any faith – have an ounce more right to respect than any other individual simply because they make use of divine rhetoric in their political demands.
  • [1] Brian Barry, Culture and Equality, 2001, 11-12.
  • [2] Jens-Martin Eriksen, Frederik Stjernfelt, Adskillelsens politik. Multikulturalisme – ideologi og virkelighed [The Politics of Segregation. Multiculturalism – Ideology and Reality], Lindhardt og Ringhof, Copenhagen: 2008.

Daily Dharma - Charlotte Joko Beck on Awareness


A nice Daily Dharma from Tricycle. This is the basic impulse of Buddhism - awareness.

Charlotte Joko Beck on awareness

There's an old Zen story: a student said to Master Ichu, "Please write for me something of great wisdom." Master Ichu picked up his brush and wrote one word: "Attention." The student said, "Is that all?" The master wrote, "Attention. Attention."...

For "attention" we could substitute the word "awareness." Attention or awareness is the secret of life and the heart of practice....[E]very moment in life is absolute itself. That's all there is. There is nothing other than this present moment; there is no past, there is no future; there is nothing but this. So when we don't pay attention to every little this, we miss the whole thing. And the contents of this can be anything. This can be straightening our sitting mats, chopping an onion, visiting one we don't want to visit. It doesn't matter what the contents of the moment are; each moment is absolute. That's all there is, and all there ever will be. If we could totally pay attention, we would never be upset. If we're upset, it's axiomatic that we're not paying attention.If we miss not just one moment, but one moment after another, we're in trouble.

~ Charlotte Joko Beck, Nothing Special: Living Zen; from Everyday Mind, edited by Jean Smith, a Tricycle book.

Craig Ferguson Speaks about Compassion as a Comedian

This is a moving segment from quite a while back, but I had never seen it before (thanks to Danny Fisher for posting this).

As Danny posted:
Our friend and past interviewee James Ishmael Ford reminds us, though, that anytime is a good time to post this particular clip. And to borrow James' extraordinary eloquence:
    ...While I don't actually know for whom I'm posting this, if it is you, good luck, friend...
(If after watching the clip you'd like more information or contacts, just follow this link.)





Friday, January 16, 2009

NPR - Does U.S. Need A Culture Czar?

Interesting topic.

Does U.S. Need A Culture Czar?


Morning Edition, January 16, 2009 · The idea of a Cabinet-level official for the arts has gotten some buzz lately. After all, many other countries have ministers of culture. High-profile artists such as Quincy Jones think it's necessary in the U.S., but not everyone agrees.


Brain Mind Lecture 3 - Frontal Lobes

This is part three in the Brain Mind lecture series – parts one and two can be found at the links.
Brain Mind Lecture 3 Frontal Lobes: Lobotomy, Catatonia, Mania, Depression, Obsessions, Compulsions, Perseveration, Confabulation, Aphasia - An Introductory Overview by Rhawn Joseph, Ph.D.

The video constitutes one of six Brain Mind Introductory Lectures, posted on YouTube, each providing an introductory overview of the functional organization of the brain. To reduce confusion, all CT images have been reversed so that damage on the left appears on the left, and right-sided damage appears on the right. For a detailed presentation I recommend one of the best neuroscience texts of all time: the 2nd edition of Neuropsychiatry, Neuropsychology, Clinical Neuroscience, by Rhawn Joseph, Ph.D.





Ken Wilber on Enlightenment

This is a cool clip of The Ken talking about the quest for enlightenment and the nature of Spirit. This is taken from the new DVD Spirituality in the Modern World: A dialogue with Ken Wilber and Traleg Rinpoche. Essentially, he is presenting the Dzogchen version of enlightenment in this clip.




Archeogenetics - Our Past Within Us


The cross-disciplinary trend in the sciences is very cool. The more we see that things are interconnected, the better we will understand our world and our place in it. The new field of archeogenetics brings us one step closer.

Our Past Within Us

The new field known as archeogenetics is illuminating prehistory.

By Mark Williams

Prehistory: The Making of the Human Mind
By Colin Renfrew
Modern Library, 2008, $23.00

How did we become the thinking animals that we are? That's the question at the heart of the study of human prehistory--and the one that Colin Renfrew has been asking since the summer of 1962, when he travelled to Milos, one of the Cycladic Islands in the Aegean Sea, a source of the black obsidian that was the earliest commodity traded by humans.

Renfrew--Lord Renfrew of Kaimsthorn since he was made a British life peer in 1991 to honor his many contributions to archaeology--was then a graduate student at Cambridge. As an undergraduate, he'd first studied natural sciences before moving on to archaeology; thus, seeking a means to determine the provenance of the obsidian that prehistoric peoples favored for toolmaking, he tried the novel tactic of using optical emission spectroscopy to analyze its trace elements.

"We really hit lucky," Renfrew told me recently. "Obsidian makes much thinner, sharper blades than flint and so was a preferred substance found at almost all the early Neolithic sites in Greece. In fact, we learned it was already traded during the Upper Paleolithic." Yet the principal quarries for obsidian in the Aegean were on Milos. "So the material documents the earliest known seafaring," Renfrew says. "We needed nevertheless to be sure where it was coming from. Trace-element analysis let us characterize each different obsidian source, since they're created by relatively recent volcanoes and tend to be consistently distinguishable." Renfrew found that he could clearly graph how far the material had traveled: obsidian from a site in Anatolia (modern Turkey), in one instance, had been transported approximately 500 miles to Palestine. Overall, the picture that emerged suggested a world where most people never traveled more than a few miles from where they were born, but a few went everywhere. "It's an interesting picture," Renfrew says. "It was the seafarers who traveled distances, getting around the Aegean Islands quite widely and clearly doing that before the origins of farming."

Next, Renfrew turned his attention to what had been a cherished assumption in archaeology: that prehistoric cultural innovation originated in the Near East and diffused to Europe. "Just in archaeological terms, I didn't think that argument was very good," he says. "In Bulgaria and Romania, I'd been struck by the early metallurgy at some sites. So when radiocarbon dating arrived--particularly when tree-ring calibration came through in the late 1960s--the penny dropped." The new technological methods proved that, indeed, certain artifacts in Central and Western Europe were older than their supposed Near Eastern forerunners. Renfrew wrote a book, Before Civilization: The Radiocarbon Revolution and Prehistoric Europe (1973), pointing out that "the previous diffusionist chronology collapsed at several points."

Over the decades, Renfrew has remained at his field's cutting edge; he was among the earliest advocates of technologies like computer modeling and positron emission tomography (PET), the latter to examine contemporary subjects' brain activities as they replicated the toolmaking of Lower Paleolithic hominids. In his latest book, Prehistory: The Making of the Human Mind, Renfrew has not only produced a summary of by far the vaster part of human history but also provided an account of archaeology's advance since European scholars realized some 150 years ago that the human past extended many millennia further back than 4004 b.c.e. (the 17th-century theologian Bishop Ussher's estimate of when God had created the world). Given its vast subject and its strictures of length, probably the only real criticism one can make of the book is that in its index, under the letter R, the author is missing. It's a significant omission: Renfrew has informed today's understanding of human prehistory much as he says Gordon Childe--who is responsible for the concepts of the Neolithic and urban revolutions--shaped thinking during the first half of the 20th century. Like Childe, he has been one of the great archaeological synthesizers, working to construct a theory of global human development. For Renfrew, all archaeology ultimately leads to cognitive archaeology--the branch that investigates the development of human cognition.

In particular, Renfrew has been preoccupied by what he has dubbed the "sapient paradox": the immense time lag between the emergence of anatomically modern human beings and the advent of the cultural behaviors that we take to define humanity.

Prehistory is defined as that period of human history during which people either hadn't yet achieved literacy--our basic information storage technology--or left behind no written records. Thus, in Egypt, prehistory ended around 3000 b.c.e., in the Early Dynastic Period, when hieroglyph-inscribed monuments, clay tablets, and papyrus appeared; in Papua New Guinea, conversely, it ended as recently as the end of the last century. Archaeologists and anthropologists accept this region-by-region definition of prehistory's conclusion, but they agree less about its beginning. A few have seen prehistory as commencing as recently as around 40,000 b.c.e., with the emergence of Cro-Magnon man, who as Homo sapiens sapiens was almost indistinguishable from us (although Cro-Magnons, on average, had larger brains and more robust physiologies). However, most experts would probably say that prehistory began in the Middle Pleistocene, as many as 200,000 years ago--when Homo neanderthalensis (sometimes classified as Homo sapiens neanderthalensis) and archaic Homo sapiens emerged. Either way, it's assumed that the appearance of Homo sapiens sapiens triggered "a new pace of change ... that set cultural development upon [an] ... accelerating path of development," as Renfrew writes in Prehistory. But Renfrew thinks that this acceleration must have been due to something else.

"The evidence that Homo sapiens' arrival equates with full linguistic abilities, the human behavioral revolution, and so on is very limited," Renfrew told me, adding that he sees nothing clearly separating the flint tools of the Neanderthals from those associated with Homo sapiens. As for the cave paintings at Altamira, Lascaux, and other Southern European sites, which are 15,000 to 17,000 years old: "They're amazing, but stylistically singular and very restricted in their distribution. They mightn't be characteristic of early Homo sapiens." Overall, Renfrew thinks, if aliens from space had compared Homo sapiens hunter-gatherers with their earlier counterparts, they probably wouldn't have seen much difference.

Two and a half million years ago, the first protohumans, Homo habilis, shaped stones to take the place of the claws and fangs they lacked, using them to kill small animals and scavenge the remains of larger ones. The payoff was immense: whereas metabolic needs like food processing constrain brain size for most mammals, eating meat enabled habilis to start evolving a smaller gut, freeing that metabolic energy for the brain's use. After a few hundred thousand years, later hominids like erectus and ergaster had developed straightened finger bones, stronger thumbs, and longer legs. The expansion of hominid brains--they were twice as big within a million years, three times by the Middle Paleolithic--enabled symbolic communication and abstract thought. By 50,000 b.c.e., our ancestors had spread from Africa through Asia, Europe, and Australia.

Archaeogenetics Emerges
The paradox, or puzzle, is this: if archaic Homo sapiens emerged as long as 200,000 years ago, why did our species need so many millennia before its transition, 12,000 to 10,000 years ago, from the hunter-gatherer nomadism that characterized all previous hominids to permanent, year-round settlement, which then allowed the elaboration of humankind's cultural efforts? To answer this question, Renfrew calls for a grand synthesis of three approaches: scientific archaeology, which collects hard data through radiocarbon dating and similar technologies; linguistic study aimed at constructing clear histories of the world's languages; and molecular genetic analysis.
Read the rest of this article.


David Brin - Is the Web Helping Us Evolve?

This is an excellent article from David Brin, posted at Salon, on how technology (specifically, the web) may be shaping our current phase of evolution. It is certainly changing the ways our brains function, especially in kids who have grown up with the internet always accessible. But the hysteria on either side (Google is making us stupid vs. the singularity is coming) is way off base. Brin brings some common sense to the issue.

Is the Web helping us evolve?

The truth lies somewhere between "Google is making us stupid" and "the Internet will liberate humanity."

By David Brin

Dec. 23, 2008 | Some of today's most vaunted tech philosophers are embroiled in a ferocious argument. On one side are those who think the Internet will liberate humanity, in a virtuous cycle of e-volving creativity that may culminate in new and higher forms of citizenship. Meanwhile, their diametrically gloomy critics see a kind of devolution taking hold, as millions are sucked into spirals of distraction, shallowness and homogeneity, gradually surrendering what little claim we had to the term "civilization."

Call it cyber-transcendentalists versus techno-grouches.

Both sides point to copious evidence, as Nicholas Carr recently did, in a cover story that ran in the Atlantic, titled, "Is Google Making Us Stupid?" In making the pessimists' case, Carr offered up studies showing that the new generation of multitaskers aren't nearly as good at dividing their attention effectively as they think they are. According to Carr, focus, concentration and factual knowledge are much too beneficial to toss aside in an avid pursuit of omni-awareness.

A related and even more worrisome trend is the decline of rigorously vetted expert knowledge. You wouldn't expect this to be a problem in an era when humanity knows more -- and shares information more openly -- with every passing year, month and day. Wikipedia is a compendium vastly larger than all previous encyclopedias combined, drawing millions to contribute from their own areas of micro-expertise. But the very freedom that makes the Internet so attractive also undermines the influence of gatekeepers who used to sift and extol some things over others, helping people to pick gold from dross.

In the past, their lists and guides ranged from the "Seven Liberal Arts" of Martianus Capella to "The Great Books of the Western World," from Emily Post's "Etiquette" to the Boy Scout Manual, from compulsory curricula to expert scientific testimony. Together, this shared canon gave civilized people common reference points. Only now, anyone can post a list -- or a hundred -- on Facebook. Prioritization is personal, and facts are deemed a matter of opinion.

Carr and others worry how 6 billion ships will navigate when they can no longer even agree upon a north star.

Of course, an impulse toward nostalgia has been rife in every era. When have grandparents not proclaimed that people were better, and the grass much greener, back in their day? Even the grouches' ultimate dire consequence has remained the same: the end of the world. Jeremiahs of past eras envisioned it arriving as divine retribution for fallen grace, while today's predict a doom wrought by human hands -- propelled by intemperate, reckless or ill-disciplined minds. The difference, from a certain angle, is small.

Take the dour mutterings of another grumbler, Internet entrepreneur Mark Pesce, whose dark rumination at last year's Personal Democracy Forum anticipates a dismal near-future commonwealth. One wherein expertise is lost and democracy becomes a tyranny of lobotomized imitation and short-tempered reflex, as viral YouTube moments spread everywhere instantaneously, getting everybody laughing or nodding or seething to the same memes -- an extreme resonance of reciprocal mimicry or hyper-mimesis. And everybody hyper-empowered to react impulsively at almost the speed of thought.

"All of our mass social institutions, developed at the start of the liberal era, are backed up against the same buzz saw," Pesce said. "Politics, as the most encompassing of our mass institutions, now balances on a knife edge between a past which no longer works and a future of chaos."

From there, it seems only a small step is needed to incite the sort of riled-up rabble that used to burst forth in every small town; only, future flash mobs will encompass the globe. Pesce's scenario is starkly similar to dystopias that science fiction authors Frederik Pohl and Cyril Kornbluth portrayed, back in the 1950s, as in "The Marching Morons," or Ray Bradbury in "Fahrenheit 451," with civilization homogenizing into a bland paste of imitation and dullard sameness, punctuated by intervals of mass hysteria.

Indeed, it is this very sameness -- the "flat world" celebrated by pundit Thomas Friedman -- that could demolish global peace, rather than save it. Arguing that an insufficiency of variety will eliminate our ability to inventively solve problems, Pesce dramatically extrapolates: "Fasten your seatbelts and prepare for a rapid descent into the Bellum omnium contra omnes, Thomas Hobbes' war of all against all ... Hyperconnectivity begets hypermimesis begets hyper-empowerment. After the arms race comes the war."

Wow. Isn't that cheery? Well, with Michael Crichton no longer around to propound that there "are things mankind was never meant to know," perhaps Carr and Pesce are auditioning to fill in, offering the next vivid anthem for a rising renunciation movement -- the nostalgic murmur that technology and "progress" may have already gone too far.

Responding to all of this -- on the Encyclopaedia Britannica Blog -- Clay Shirky, the technology forecaster and author of "Here Comes Everybody," presents an equally impressive array of evidence showing that the ability of individuals to autonomously scan, correlate and creatively utilize vast amounts of information is rising faster, almost daily. In the human experience, never before have so many been able to perceive, explore, compare, analyze and argue over evidence that questions rigid assumptions. How can this not lead to insights and exciting new breakthroughs at an accelerating pace?

Perhaps even fast enough to get us ahead of all our modern perplexities and problems.

Nor is this refrain new. From Jefferson and Franklin to Teilhard de Chardin and J.D. Bernal, the tech-happy zealots of progress have proclaimed a rebellious faith in human self-improvement via accumulating wisdom and ever-improving methodology.

Even some artists and writers began siding with the future, as when Bruno Bettelheim finally admitted that it was OK to read fairy tales, or when H.G. Wells stood up to Henry James over whether stories can involve social and scientific change. Confronting stodgy culture mavens, modernists and science fiction writers spurned the classical notion of "eternal verities" and the assumption that all generations will repeat the same stupidities, proclaiming instead that children can make new and different mistakes! Or even (sometimes) learn from the blunders of their parents.

Before the Internet was more than an experimental glimmer, Marshall McLuhan fizzed: "But all the conservatism in the world does not offer even token resistance to the ecological sweep of the new electric media."

Read more.


Newsweek - The Six Worst Diet Fads


Ah, the New Year, a time of fresh starts, lofty goals, and dumbass diets. If you have made the decision (resolution) to get healthy in 2009 [Good for you!], don't fall victim to one of these foolish and unhealthy diets.

I can speak with authority on the first one, which was popular back in the 1990s. Eating a very-low-fat or no-fat diet is about the stupidest thing you can do, but no one knew better back then (or at least I didn't). Hair loss, brittle nails, dry skin, and no sex drive are just some of the health side effects. More importantly, you need fat to lose fat – every cell membrane in the body is built from fats, so if you don't have any in your diet, your body won't give up its stores because they will be needed to make new cells, a process going on 24/7.

Courtesy of Newsweek.

Worst Diet Fads

Barbara Kantrowitz And Pat Wingert

We take a look at this year's crop of trendy and potentially counterproductive weight-loss plans.

If you've visited a bookstore this month, you probably weren't able to avoid the giant pyramid of diet books that magically appears right after Jan. 1. Whether you find those titles inspiring or just guilt inducing, one thing is certain: many of them sound too good to be true. And often they are. These popular weight-loss fads usually aren't all that new, but they come around every year with new names—and maybe even a fresh celebrity connection. Here are the six basic diet trends, why they don't work and why they may even be unhealthy. You won't find these exact titles on a shelf near you, but hopefully you'll be able to recognize the gimmick when you see it no matter how it's been tarted up.

(For more on weight loss, check out this story about diet tactics that are backed by science: "What Works.")

1. The Fat-Free Diet
The theory: Eat whatever you want as long as it has no fat. If your diet contains no fat, you won't get fat.
Reality check: While it's true that extra fat in your diet adds calories, just sticking to foods touted as fat free doesn't necessarily help. Supermarket shelves are crammed with products advertised as fat free that are loaded with sugar and empty calories and that offer little in the way of fiber, vitamins or minerals. Check product labels before you buy.

2. The Snack-Pack Diet
The theory: Cookies and chips sorted into 100-calorie packs help limit the damage from an attack of the munchies.
Reality check: The dozens of 100-calorie snack pack foods on the market now may offer a lower-calorie alternative, but few of them are truly healthy choices, and they aren't likely to be very filling or fiber rich, which can send you running for another bag or something less healthy. Rather than knocking back a 100-calorie bag of Cheetos or Oreos, look for more nutritious alternatives, like half of a sandwich made with whole-wheat bread or a piece of fruit.

3. The Couch-Potato Diet
The theory: Who needs exercise? You can lose weight without working out! Cutting back on calories is enough.
Reality check: Who needs exercise? You do. Studies have shown that dieters who change what they eat and increase their regular activity are more likely to lose weight and keep it off. Increasing activity has other health benefits as well, such as lowering your risk of heart disease, the leading killer of women. Even a brisk 20- to 30-minute walk most days of the week can make a big difference.

4. The Detox Plan
The theory: You'll lose weight when you clean out your insides by downing a concoction made from orange juice and molasses or some other bizarre mix.
Reality check: There's no evidence that purging your intestines of "toxins" makes you any healthier or more likely to lose weight. A high-fiber diet is all you really need.

5. The Beef and Bacon Diet
The theory: All protein, all the time, and don't worry too much about fat.
Reality check: Cutting down on carbs, especially empty carbs like white flour and white rice, can help you lose weight, but a diet that contains large amounts of fatty meat simply isn't healthy. A better choice would be leaner proteins, like fish or chicken (grilled or broiled without the skin), plus five servings of fruit and vegetables, a serving or two of whole-grain carbs and some low-fat dairy.

6. The Twinkie Diet
The theory: Calories are all that counts, so eat whatever you want, including nothing but Twinkies, and you will lose weight as long as your total is under your daily limit.
Reality check: Although calories do count, the source of those calories is important. If you eat only junk, your body will lose out on vital nutrients and that can have long-term consequences for your health. So watch those calories, and watch where they come from.

So instead of spending $25 on another book advocating one of the above approaches, try going back to basics.

The first, admittedly obvious, step is to eat less. You may not even be aware of how much or what you are eating. Keep a food journal for a few days. It will help pinpoint trouble spots. Studies have shown that writing down what you eat is one of the most effective ways to cut back and that dieters who do so tend to lose more weight than those who don't. One of the most helpful books out there on the psychology of why we eat is "Mindless Eating" (Bantam Dell) by Brian Wansink, head of the Cornell University Food and Brand Lab. In addition to explaining the many reasons it's so easy to overeat in American culture, he offers tips that can help you reduce consumption more easily, like buying smaller plates or even just moving the candy dish a few feet farther from your desk.

The second step is also pretty simple: get more exercise. One upside of the current economic downturn is that lots of gyms are offering discounts on membership. Take advantage of that.

For more diet and exercise tips, check out mypyramid.gov. It's a free government site with easy-to-use diet and exercise trackers.


Thursday, January 15, 2009

David Spiegel - Coming Apart: Trauma and the Fragmentation of the Self

David Spiegel is one of the experts on Dissociative Identity Disorder and dissociation, an area of psychology that is still quite mysterious to a lot of people, including therapists. This is his Dana Foundation biography:

David Spiegel

David Spiegel, M.D., is the Willson Professor in the School of Medicine and Associate Chair of Psychiatry and Behavioral Sciences at the Stanford University School of Medicine. He collaborated in the inclusion of Acute Stress Disorder, a new psychiatric diagnosis, in the Diagnostic and Statistical Manual, Fourth Edition (DSM-IV), and chaired the work group on Dissociative Disorders. Among his more than 400 scientific journal articles and chapters and 10 books, he is the editor of Dissociation: Culture, Mind and Body (American Psychiatric Press, 1994), and co-editor of Traumatic Dissociation (American Psychiatric Publishing, 2007). His research on the health effects of psychosocial support was the subject of a segment on Bill Moyers' Emmy Award–winning special Healing and the Mind. He is Past President of the American College of Psychiatrists and the Society for Clinical and Experimental Hypnosis.

The diagnosis of DID in trauma patients has skyrocketed in recent years, but I suspect (as do many others) that what we are seeing is a lot of false diagnoses. Proven cases of DID (what was once known as multiple personality disorder) are rare at best – and some would argue that since the diagnosis is largely confined to North America, it is not a "real" disorder at all.

I don't reject the diagnosis outright, mostly because I know a therapist who is working with a true multiple (who has suffered unimaginable abuse and trauma). This patient has a "part" who bought and paid for a house that was unknown to the patient's main identity, or to her husband. Beyond such anecdotal evidence, there is considerable evidence that in true multiples there are distinct physiologies as well as identities, a mystery no one seems able to solve so far.

Anyway, here is the Spiegel article.

Coming Apart: Trauma and the Fragmentation of the Self

By David Spiegel, M.D.
January 31, 2008

The controversial diagnosis of dissociative identity disorder (DID) has replaced what once was called "multiple personality disorder." People diagnosed with DID have trouble integrating their memories, sense of identity, and aspects of consciousness into a unified whole. New research supports the diagnosis and sheds light on what may have gone wrong in patients' brains, suggests David Spiegel, M.D. Spiegel, who chaired the professional working group that recommended the change of name in psychiatry's principal diagnostic manual, notes that the disorder likely stems from trauma and can be considered a severe form of post-traumatic stress disorder. Among the biological markers he describes are a smaller hippocampus and differences in certain neurotransmitters. A better understanding of the importance of specific regions of the brain to memory and emotion may help push research forward.

In pop culture, “multiple personality disorder” is often portrayed as involving strategic, dramatic, and seductive battles among personalities that are uncomfortably sharing one hapless body. On TV crime shows and in movies the “split personality” is used as a dramatic excuse for mayhem or is feigned to evade criminal responsibility. Some believe that the disorder is the creation of credulous and overeager therapists. However, these and other common perceptions are mistaken. This article is written to set the record straight, to explain what this disorder is and what we understand about its causes, both in early life experience and in the brain. Some people do have what scientists now call “dissociative identity disorder” (DID), a name change made official in 1994, when the American Psychiatric Association published the fourth edition of its Diagnostic and Statistical Manual of Mental Disorders. Sufferers experience sudden loss of episodic memory, change from a sad, dependent, and helpless personality state to an angry, demanding, hostile one in seconds, and may find themselves in situations that they cannot understand. But they are the victims, not the authors, of their own fragmentation.

One "identity" may inflict physical damage on their body as "punishment" for another "personality" state, such as the patient who carved "I hate Mary," another of her identities, into her forearm with a knife. Mary was frightened and mystified about the injury. Such memory loss is often asymmetrical—one identity may be aware when another is prominent, but not vice versa.

The problem is not that there are "multiple personalities" existing in one body, as the old name of the disorder implied, but rather that the brain fails to integrate our different personae. We normally act like "different people" at work and at a party (hopefully), but we have continuity of memory and identity across the differences. Patients with DID do not. In fact, the problem is not that they have more than one personality, but rather that they have less than one—a fragmentation of self rather than a proliferation of selves.

People with dissociative disorders are like actors trapped in a variety of roles. They have difficulty integrating their memories, their sense of identity and aspects of their consciousness into a continuous whole. They find many parts of their experience alien, as if belonging to someone else. They cannot remember or make sense of parts of their past.

Dissociative symptoms involving alterations in identity, memory, consciousness, and body function are seen in cultures around the world, described as "ataques de nervios" in many Hispanic cultures and as states of trance and possession in China, Japan, and India. DID is not all that rare. It affects some 1 percent of people in the United States, 0.5 percent in China, and 1.5 percent in Turkey and the Netherlands, according to various studies in these countries.

Controversy has swirled around the disorder, in part because it is extreme and dramatic. But new research has helped us understand the origins of this tragic condition, as well as how it is reflected in the brain.

Roots in Trauma

Evidence is accumulating that trauma, especially early in life, repeated, and inflicted by relatives or caretakers, produces dissociative disorders. DID can be thought of as a chronic, severe form of post-traumatic stress disorder. The essence of traumatic stress is helplessness—a loss of control over one's body. The mental imprint of such frightening experiences sometimes takes the form of loss of control over parts of one’s mind—identity, memory, and consciousness—just as physical control is regained. During and in the immediate aftermath of acute trauma, such as an automobile accident or a physical assault, victims have reported being dazed, unaware of serious physical injury, or experiencing the trauma as if they were in a dream. Many rape victims report floating above their body, feeling sorry for the person being assaulted below them. Sexually or physically abused children often report seeking comfort from imaginary playmates or imagined protectors, or by imagining themselves absorbed in the pattern of the wallpaper. Some continue to feel detached and dis-integrated for weeks, months or years after trauma.

Abuse by a trusted authority figure such as a parent creates special problems. A child abused by a family member faces an ongoing dilemma: this beloved figure is inflicting harm, pain, and humiliation, yet the child is both emotionally and physically dependent. The child has to maintain two diametrically opposing views of the same person, which creates considerable tension and confusion, a situation described by psychologist Jennifer Freyd as "betrayal trauma."1 She showed that people prone to dissociation have selective amnesia for trauma-related words such as "incest." Freud wrote that “hysterics [his term for people prone to dissociation] suffer mainly from reminiscences." His point was that their often dramatic mental and physical symptoms were the product of early life trauma and conflict over sexually charged situations.

Can a Person Forget Trauma?

Humans process vast amounts of information. We can function only by being strategically selective in our awareness. To do otherwise would be like having every stored file in a computer open at once, or all the contents of one’s office file cabinets spread out on the desk at the same time. Emotional arousal typically leads to increases in recall—most of us remember September 11, 2001, with more than average detail. However, we frequently try to control our emotional response to traumatic events, sometimes at the expense of recollection of them. Chelsea Clinton, who was living in Manhattan on 9/11, wrote in a magazine article that she started walking downtown toward the World Trade Center after the attack but hours later found herself uptown, with no memory of how she had gotten there.

Research bears out that blocking emotion about a trauma can also block memory of it. Neuroscientists Larry Cahill, James McGaugh and colleagues at the University of California–Irvine had volunteers watch slides of an accident. Before seeing the slides, one group was given a beta-blocker, a drug that blocks the stress-induced increase in heart rate and blood pressure triggered by the sympathetic nervous system. In these subjects, the arousal-related enhancement of recall seen in subjects given a placebo was blocked.2 Other research goes a step further, helping us understand what happens in the brain when we suppress memories. John and Susan Gabrieli and colleagues at Stanford and Michael Anderson at the University of Oregon3 used functional magnetic resonance imaging (fMRI), a sophisticated brain imaging technique, to study the brain's ability to inhibit memory. When participants were asked to block their memory of word associations, the imaging showed increased activity in the dorsolateral portion of the prefrontal cortex, the part of the brain that enables us to stop and think, coupled with decreased activity in the hippocampus, the structure deep in the brain that controls memory storage and retrieval.

Evidence that this inhibition of memory happens in real life is more than anecdotal. Linda Meyer Williams4 tracked down young women who had been treated in hospital emergency rooms for physical and sexual abuse an average of 17 years earlier, during their childhood, and interviewed them about their history of trauma. Thirty-eight percent of them could not remember the episode that made the trip to the hospital necessary, although many discussed other episodes of abuse in detail. Another 14 percent reported that they had been unable to recall the traumatic episode for a period of time lasting months to years. One would think that anyone actually brought to a hospital emergency room for treatment would recall the necessitating episode, yet a substantial minority could not. While voluntary suppression of emotionally laden memories is less likely to be successful than suppression of neutral memories, psychologist Martin Conway of the University of Bristol in England has found that when people are motivated to forget, they are more likely to do so for trauma-related memories than for neutral ones.5

The pressure to forget is greater when children are abused by a trusted caregiver, whose continuing presence might unavoidably cue memory retrieval. The only way to prevent persistent recall of damaging memories would be to adapt internally and deliberately avoid thinking of them: in Freud's terms, to push them away from consciousness. A study published in 2007 by Geraerts and colleagues at Maastricht University in the Netherlands6 provides additional evidence that some people simply do not persistently remember traumatic experiences. Forty percent of their sample of 98 people who responded to a newspaper advertisement about an abuse history reported discontinuous memories of it.

Why does this happen? For one thing, people naturally enter an unusual mental state during traumatic experiences. Their attention is narrowly focused. “The prospect of the gallows concentrates a man’s mind wonderfully,” Samuel Johnson famously noted. Mugging victims can often give a precise and detailed description of the assailant’s gun, but can describe little about his face. Dissociation can further isolate memories, by separating them from common associative networks in the brain that would make associative memory retrieval easier. Thus trauma can elicit dissociation, complicating the necessary working through of traumatic memories. The nature of the acute response may influence long-term adjustment.

Often people who have suffered trauma consciously try to suppress their recollection of the painful events. Over time the forgetting becomes automatic rather than willful, in the same way that riding a bicycle requires a great deal of conscious mental and physical effort during the learning phase but becomes automatic over time.

Trauma can be conceptualized as a sudden discontinuity in experience: one minute everything is fine; the next, one is in serious danger. This may lead to a process of memory storage that is similarly discontinuous with the usual range of associated memories, which might explain the "off/on" quality of dissociative amnesia, and its reversibility with techniques such as hypnosis. However, though dissociated information is out of sight, it is not out of mind. The information kept out of consciousness nonetheless has effects on it.

Insight from Post-Traumatic Stress Disorder

Many people suffering from PTSD are unable to recall important aspects of the trauma. Others feel detached or estranged from people afterward. Emory University psychiatrist Douglas Bremner found high levels of dissociative symptoms among Vietnam veterans with PTSD, and they also reported dissociating during combat.7

In a sample of 122 women seeking treatment for childhood sexual abuse, my research team found that a majority (66, or 54 percent) experienced PTSD symptoms. These women had more dissociative symptoms than those who did not evidence PTSD symptoms.8 Furthermore, among those with PTSD, dissociative symptoms were associated with higher levels of childhood abuse. Those with symptoms of dissociation also had more symptoms of physiological hyperarousal, such as a pronounced startle response after hearing a loud noise, suggesting that there is an association between psychological avoidance and physiological hyper-reactivity.

However, other studies provide evidence that dissociative detachment after a traumatic experience numbs the body as well as the mind. Psychologists Michael Griffin, Patricia Resick, and Mindy Mechanic at the University of Missouri studied women who had been raped. Within two weeks of the rape, women with PTSD resulting from the assault who reported high levels of dissociation during the rape had smaller increases in heart rate and skin conductance, each a measure of the autonomic nervous system’s stress response, during exposure to trauma-related memories. The women with PTSD but lower levels of dissociation responded with larger increases.9 Similarly, neuroscientist Ruth Lanius at the University of Western Ontario in Canada10, 11 studied people with PTSD and dissociative symptoms resulting from sexual abuse. Those with high levels of dissociation showed no increase in heart rate when read scripts with vivid descriptions of their trauma but had activation in the prefrontal cortex (which is responsible for thought and inhibition) and parts of the limbic system (which is responsible for emotion) on functional magnetic resonance imaging scans. Those with lower levels of dissociation responded with increased heart rates and less activity in those brain regions during this task.

Other studies reveal a distinction between the body's immediate, neural stress response and the secondary, hormonal response. Dissociation after trauma is linked with higher salivary levels of cortisol, a stress hormone that mobilizes glucose into the blood to assist with the fight-or-flight response; in one study, cortisol levels were measured 24 hours after a stressful interview among adult women who had been sexually abused during childhood.12 So while the immediate neural stress response system is suppressed by dissociation, the secondary hormonal stress response system is triggered by it.

What Happens in the Brain

Dissociative disorders involving fragmentation of identity, memory and consciousness seem less mysterious if we conceptualize identity as the product of mental effort rather than a given—a bottom-up rather than a top-down model of how the brain processes information. Neural systems that process the coincident firing of millions of neurons at a time must extract coherence from all this activity, and it is not surprising that in some cases these systems do not succeed. Neurons that fire together wire together, but building large, complex, and yet coherent neural networks may not always lead to a coherent sense of identity. Factors that restrict neurons from firing in association may limit the continuity of identity that emerges from experience and memory.
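[A note on the "fire together, wire together" principle invoked above: it is usually formalized as the Hebbian learning rule, in which the strength of a connection grows in proportion to how co-active two neurons are. The sketch below is my illustration of that rule only, not part of Spiegel's article; the neuron activities and learning rate are hypothetical.]

```python
# Illustrative sketch only (not from the original article): a minimal
# Hebbian update. Co-active model neurons strengthen their mutual
# connection; a silent neuron acquires no associative links.
import numpy as np

x = np.array([0.9, 0.8, 0.0, 0.1, 0.0])  # activity of 5 model neurons
w = np.zeros((5, 5))                      # pairwise connection strengths
eta = 0.1                                 # learning rate (hypothetical)

w += eta * np.outer(x, x)                 # "fire together, wire together"
np.fill_diagonal(w, 0.0)                  # ignore self-connections

# Neurons 0 and 1 (both active) are now linked; neuron 2 (silent)
# remains unconnected, an analogue of a memory left isolated from the
# networks that would normally cue its retrieval.
print(w.round(3))
```

On this reading, the article's point is that factors restricting co-firing leave some experiences weakly linked to the rest of the network, and therefore hard to retrieve by ordinary association.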

Hippocampal Volume

Another plausible neurobiological mechanism linking childhood trauma to dissociative difficulties with the integration of memory is smaller hippocampal volume. As mentioned above, the hippocampus, part of the limbic system situated in the middle portion of the temporal lobe, organizes memory storage and retrieval. The hippocampus is rich in glucocorticoid receptors, which are sensitive to stress-induced exposure to cortisol. Researchers have provided strong evidence in animals that early life experiences have lasting effects on the hormonal stress response system, either making it unduly sensitive to stress or protecting it from overreaction throughout life. Studies in humans show that while minor stressors may produce resilience, childhood sexual abuse does the opposite: it sensitizes the individual to subsequent stressors decades later. This research indicates that chronically elevated cortisol levels may damage the hippocampus, leading to smaller size and poorer function.

Imaging studies by Murray Stein at the University of California, San Diego, and Eric Vermetten at Utrecht University in the Netherlands have shown that people with a history of childhood abuse and dissociative disorders indeed have smaller hippocampi, and that the reduction in size correlates with the severity of dissociative symptoms.13, 14 Vermetten also found reductions in the size of the amygdala, the seat of fear and anger conditioning. Researcher Douglas Bremner found similarly reduced hippocampal volumes among veterans with PTSD symptoms. However, Harvard psychiatrist Roger Pitman proposed an alternative explanation for this relationship.15 He studied 35 pairs of identical twins; in each pair, one twin had been exposed to trauma and the other had not. Pitman found that smaller hippocampal volume is indeed a risk factor for PTSD severity, but is not affected by exposure to trauma. A smaller hippocampus, he reasoned, may underlie vulnerability to the development of PTSD symptoms rather than occurring as a result of trauma exposure.

In any case, a smaller hippocampus would likely limit a person's ability to encode, store and retrieve memories and manage the emotions associated with them. The hippocampus is a context generator, helping us to put information into perspective. Oliver Wolf has shown that activity in the hippocampus buffers the effects of stressful input on the hormonal stress response system.16 Ruth Lanius demonstrated that those who dissociate in response to listening to accounts of their traumatic experiences have decreased activity in the brain region adjacent to the hippocampus: they remember less, and their brain memory systems are less active.11 Limitations on hippocampal size and function hinder memory processing and the ability to comprehend context, especially in light of contradictory memory encoding and storage. Among patients with PTSD and dissociative symptoms, research also indicates higher connectivity between two portions of the brain, the right insula and the left ventrolateral thalamus, which are involved in the perception of bodily processes, emotion, and consciousness. This finding provides further evidence that traumatic memories trigger both mental and physical distress.

Neurotransmitter Activity

Neurotransmitters convey information from one nerve cell to another, and specific ones may be involved in dissociation. It has long been known that drugs that block the activity of the N-methyl-D-aspartate (NMDA) subtype of glutamate receptors in cortical and limbic brain regions produce dissociative symptoms, perhaps via a surge of glutamate release. Anti-anxiety medications such as lorazepam enhance the activity of gamma-aminobutyric acid (GABA), a neurotransmitter that inhibits rather than stimulates activity in many regions of the brain. Yale researcher John Krystal has suggested that GABA may also play a role in dissociative symptoms. His work suggests that administering a drug that enhances GABA activity increases dissociation.17, 18

Coming Together: Future Research on Dissociation

Two heads are not better than one when they share the same brain. The fragmentation of mental function that can occur after a series of traumatic experiences may both protect a person from distress and make it harder for the individual to put the trauma into perspective. As we come to appreciate the complexity of neural development, we also understand that early life experiences have a profound effect on the developing brain. Dissociation reminds us that achieving a sense of mental unity is a difficult task, one that can be disrupted by events that challenge body integrity, emotional control, and the development of relationships. Future research will reveal more about specific genetic vulnerabilities that may make certain individuals especially susceptible to the disorganizing effects of traumatic stress.

We also need to understand more about neural development and function: How do specific regions of the brain facilitate or inhibit memory, emotion, and their interaction? How can we use this knowledge to better treat individuals suffering from dissociation? Current treatments primarily involve psychotherapy, and increasing knowledge of brain structure and function may provide necessary connections for therapists and their patients, helping these individuals to understand and control their dissociative tendencies while working through the consequences of traumatic experiences. Other research may lead us to a specific medication that treats uncontrolled dissociation; at present there is none.

As we better understand the control systems in the brain that underlie dissociation, we hope to help people respond to trauma in ways that do not reinforce feelings of helplessness but rather augment their control over their identity, memory and consciousness.


References

1. Freyd, JJ, Klest, B, and Allard, CB. Betrayal Trauma: Relationship to Physical Health, Psychological Distress, and a Written Disclosure Intervention. Journal of Trauma and Dissociation 2005; 6(3): 83–104.

2. Cahill, L, et al. Beta-adrenergic Activation and Memory for Emotional Events. Nature 1994; 371: 702–704.

3. Anderson, MC, et al. Neural Systems Underlying the Suppression of Unwanted Memories. Science 2004; 303(5655): 232–235.

4. Williams, LM. Recall of Childhood Trauma: A Prospective Study of Women's Memories of Child Sexual Abuse. Journal of Consulting and Clinical Psychology 1994; 62: 1167–1176.

5. Conway, MA. Cognitive Neuroscience: Repression Revisited. Nature 2001; 410(6826): 319–320.

6. Geraerts, E, et al. The Reality of Recovered Memories: Corroborating Continuous and Discontinuous Memories of Childhood Sexual Abuse. Psychological Science 2007; 18(7): 564–568.

7. Bremner, JD, et al. Dissociation and Posttraumatic Stress Disorder in Vietnam Combat Veterans. American Journal of Psychiatry 1992; 149(3): 328–332.

8. Ginzburg, K, et al. Evidence for a Dissociative Subtype of Post-traumatic Stress Disorder among Help-Seeking Childhood Sexual Abuse Survivors. Journal of Trauma and Dissociation 2006; 7(2): 7–27.

9. Griffin, MG, Resick, PA, and Mechanic, MB. Objective Assessment of Peritraumatic Dissociation: Psychophysiological Indicators. American Journal of Psychiatry 1997; 154(8): 1081–1088.

10. Lanius, RA, et al. Neural Correlates of Traumatic Memories in Posttraumatic Stress Disorder: A Functional MRI Investigation. American Journal of Psychiatry 2001; 158(11): 1920–1922.

11. Lanius, RA, et al. Brain Activation during Script-Driven Imagery Induced Dissociative Responses in PTSD: A Functional Magnetic Resonance Imaging Investigation. Biological Psychiatry 2002; 52(4): 305–311.

12. Koopman, C, et al. Dissociative Symptoms and Cortisol Responses to Recounting Traumatic Experiences among Childhood Sexual Abuse Survivors with PTSD. Journal of Trauma and Dissociation 2003; 4(4): 29–46.

13. Vermetten, E, et al. Hippocampal and Amygdalar Volumes in Dissociative Identity Disorder. American Journal of Psychiatry 2006; 163(4): 630–636.

14. Spiegel, D. Recognizing Traumatic Dissociation. American Journal of Psychiatry 2006; 163(4): 566–568.

15. Pitman, RK. Hippocampal Diminution in PTSD: More (or Less?) Than Meets the Eye. Hippocampus 2001; 11(2): 73–74; discussion 82–84.

16. Wolf, OT, et al. Basal Hypothalamo-Pituitary-Adrenal Axis Activity and Corticotropin Feedback in Young and Older Men: Relationships to Magnetic Resonance Imaging-Derived Hippocampus and Cingulate Gyrus Volumes. Neuroendocrinology 2002; 75(4): 241–249.

17. Morgan, CA, et al. Symptoms of Dissociation in Healthy Military Populations: Why and How Do War Fighters Differ in Responses to Intense Stress? In E Vermetten, MJ Dorahy, and D Spiegel, eds., Traumatic Dissociation: Neurobiology and Treatment, 157–179. Washington, DC: American Psychiatric Publishing, 2007.

18. Krystal, JH, et al. Toward a Cognitive Neuroscience of Dissociation and Altered Memory Functions in Post-traumatic Stress Disorder. In MJ Friedman, DS Charney, and AY Deutch, eds., Neurobiological and Clinical Consequences of Stress: From Normal Adaptation to PTSD, 239–269. Philadelphia: Lippincott-Raven, 1995.