
Truth 31 March 2006

Posted by Todd in Ethics, History, Philosophy & Social Theory, Religion.
comments closed

As some of you know, I participate in what's called "small group ministries" at the Unitarian church across the street from my house. It's more or less a small group of people who get together a couple times a month to discuss spiritual, ethical, or personal issues. In some ways, I find it meditative, because it requires you to listen to others attentively and take their experiences and points-of-view seriously; on the other hand, I just find it a welcome social-spiritual outlet for me, as my life has become increasingly isolated over the past few years. I got this quote from our group's meeting last night. [A brief Google search reveals that this quote is quite commonly known; I must not have paid attention to that day in my early-modern English lit course in college.]

Our faith and knowledge thrive by exercise, as well as our limbs and complexion. If the waters of truth flow not in a perpetual progression, they sicken into a muddy pool of conformity and tradition. The light which we have gained was given us not to be ever staring on, but by it to discover onward things more remote from our knowledge. Where there is much desire to learn, there of necessity will be much arguing, much writing, many opinions. Give me liberty to know, to utter, and to argue freely according to conscience, above all liberties. And though all the winds of doctrine were let loose to play upon the earth, so truth be in the field we do injuriously to misdoubt her strength. For who knows not that truth is strong, next to the Almighty; she needs no policies, no stratagems, to make her victorious. Let her and falsehood grapple; whoever knew truth [to be] put to the worse in a free and open encounter?
—John Milton

With all the beating the Enlightenment has taken over the past 35 years, I find myself increasingly drawn to the values and ethics of enlightened inquiry, even though I take the critiques seriously. Post-post-modernism, I find that I still hold dear this idea of free interchange of ideas, active seeking of truth, the ongoing flow of truth as humans work to find it. I would add to Milton only that truth is an event horizon, a calculus limit, a mirage in the distance, which dissipates as we move toward it. Truth is, nonetheless, the best possible end-in-view of our intellectual activity; we may never arrive, but an ethical search for truth is a good way to live out one's life.


Arts & Crafts Movement 29 March 2006

Posted by Todd in Culture.
comments closed

Yesterday I went to see a special exhibit at the newly completed and reopened de Young Museum of Fine Art, in Golden Gate Park.

It was my first time to the museum since it was reopened last fall, and the architecture blew me away. The building has this incredible flow to it, with amazing amounts of sunlight coming through several terrariums on the inside. I wish I were better at writing about art/architecture, because I just can't describe it. To be honest, I didn't pay much attention to most of the galleries because I was so awestruck by the space itself. I'll definitely be going back.

The special exhibit I went to see was of the International Arts and Crafts Movement from the turn of the 20th century.

I have always been drawn to that period of architecture and design, but have never really studied it or looked into the movement or its production in depth. There's something earthy and folksy about their design, yet it's thoroughly modern. As many of you know, much of my historical training focuses on what is called "modernity", the period of time when Western societies made the dramatic shift from local, individual production to industrial mass production, and when society had to completely remake itself to support the new modes of production. (The A&C movement started in England (1860s) much earlier than in the U.S. (1890s), which makes sense, given that slavery and sectionalism put us about 50 years behind in industrializing.)

What struck me was the connection between the A&C movement and the incredible transformation capitalism had wrought. These men and women (the movement was the first in world history to include a significant proportion of women designers) felt that mass production had stolen the "soul" of the production of goods, disconnected human beings from the objects in their lives by removing them from their production. They looked to the past (especially the medieval past, or at least what they thought of as medieval) to inform their rejection of mass-produced objects and mass-produced design, everything from books, to clothing, to furniture, to textiles. And they looked to nature for design inspiration, as a way to reconnect themselves to the land they lived on, feeling that the modes of capitalist production had wrenched the people away from who they were.

In some ways, this was a romantic fantasy. They created a past which hadn't really existed (which is always the case with cultural revivals); they created a cultural movement only accessible to the elite (because of mass production's economic effects, hand-crafted objects had become incredibly expensive); and they weren't dealing with the reality of the way things actually had become (with the exception of America, which was somewhat less ambivalent or troubled by modernity, and actually mass produced its objects and even its homes (you could buy a bungalow in a kit for $400 and assemble it yourself)).

On the other hand, both their philosophy and their art and design continue to resonate, because we still live in the wake of modernity. Sure, we've become more accustomed to throw-away goods, and it is clear that people make meaning out of mass-produced objects as much as they did out of artisanally produced objects 150 years ago. Yet even now, there's no denying that the quality of our relationships to objects, and, for the American A&C movement, especially to our very homes, has been irrevocably altered. What I love is their normative aesthetic, that every aspect of a living space, a home, should be a work of art. [Left: A recreation of a Stickley interior from the 1910s.]

On Knowing (More on Religion vs. Science) 25 March 2006

Posted by Todd in American Pragmatism, Cognitive Science, Philosophy of Science, Religion, Science.
comments closed

Among my most ardent interests is the study of how human beings know. I had never thought too much about the question until I began studying John Dewey in 1998. For Dewey, human knowledge is necessarily embodied and experiential. He called it the organism-environment model, where the embodied individual knows only in transaction with its environment, and where for humans, environment is broadly construed to include the social (other humans) and cultural (symbolic, meaningful, and linguistic) elements of experience.

The traditional philosophical epistemology was based in what Dewey called the "Spectator Model" of knowledge, where philosophers (think: Plato) knew stuff from the "outside" as disembodied spectators. Dewey is making a key distinction between objective reality (which exists independent of human experience) and our knowledge of reality (which is necessarily experiential). In the spectator model, reality is knowable unmediated by our bodies and experiences, thereby lending authority to the claims of philosophers who "know" it, and setting up a foil against which unreal, false, and situated knowledge could be compared. To the contrary, Dewey (and William James, and Charles Peirce, and George Herbert Mead) argued that whether you are a philosopher, a scientist, an engineer, a farmer, a hunter-gatherer, or a housewife, your knowledge comes from the same place: in a transaction with your environment—it is always mediated through experience.[1]

James called this "pure experience," where the "vital flux of life" becomes the very raw stuff with which and about which we think; and the process of thinking about it produces any number of "objects" that we create to make sense of and to manipulate and change our environment.

Recently, my good friend Matt, in responding to my "Evolution of God" post of last weekend, raised important issues in the experience of knowing:

In other words, the things that we know (in this case mathematics) do not arise out of an embodied experience but exist independent of the human mind. 1+1=2 even if humans don't exist and even if it can't be proven without some fundamental intuitions about the nature of 1.

For me, there are actually two separate issues here. First, is there an objective reality outside of human experience? And second, how do human beings know that reality? On the first issue, this is a variation on the oldie but goodie, "If a tree falls in the forest and no one is around to hear, does it make a noise?" Is there an objective reality of the tree falling and of the air molecules compressing and spreading outward from the event in such a way as to create noise, or does noise only exist because we hear it? It is vital to understand that there are objective realities out there which are not directly experienced by any human observer, which nonetheless exist and are real. Ironically, our own experience tells us this is so. We can come upon the fallen log 100 years later and observe the consequences of its fall. The tree fell independent of human mind.

The second issue, however, is for me the interesting question. It isn't whether or not the tree actually made noise, but how we would know one way or the other. Or to use Matt's example, the interesting question isn't whether or not 1+1 exists outside of human experience, but rather, how human beings come to know that 1+1=2 and that it exists independent of our experience.

It is the process of knowing that is at issue. What is key here in the science and religion "controversy" is that human knowledge comes from the same place, regardless of whether you're trying to figure out how to build a $1.3 billion bridge from Oakland to Yerba Buena Island, or trying to figure out how to teach your kids to look both ways before crossing the street, or trying to figure out how mollusks breed, or trying to understand the emptiness you feel when you're home alone. From philosophy to theology to biology to parenting to engineering to athletics to auto mechanics, human knowing is necessarily embodied and experiential.

I've been reading an article on the development of American Pragmatism and its transformation into Neo-Pragmatism in the past couple of decades, and I stumbled upon this explanation of Charles Peirce's definition of belief, and I think it gives a nice summary of my points above:

[Peirce] construes belief as involving habits of action, and doubt as the unsettled state resulting from the interruption of a belief-habit by recalcitrance on the part of experience. This real, living doubt, unlike Cartesian paper doubt, is both involuntary and disagreeable. The primitive basis of the most sophisticated human cognitive activity is a homeostatic process in which the organism strives to return to equilibrium, a process beginning with doubt and ending when a new habit, a revised belief, is reached.

Peirce compares four methods for the “fixation of belief.” The method of tenacity is simply to cling obstinately to whatever beliefs you have, avoiding any evidence that might unsettle them. The method of authority is to have a church or state impose conformity in belief. The a priori method — the method traditionally favored by metaphysicians — seeks to settle belief by inquiring what is “agreeable to reason.” But the most sophisticated inquirers, aspiring to indefeasibly settled belief, will always be motivated to further inquiry, never fully satisfied with what they presently incline to think. So the culmination of that primitive homeostatic process is the scientific method — the only method, Peirce argues, that would eventually result in beliefs which are indefeasibly stable, permanently safe from recalcitrance.

If you really want to learn the truth, you must acknowledge that you don’t satisfactorily know already; so the scientific inquirer is a “contrite fallibilist” (CP 1.14) ready to “drop the whole cartload of his beliefs, the moment experience is against them” (CP 1.55). In the Critical Common-sensism of Peirce’s mature philosophy — an attempted synthesis, as the phrase suggests, of Kant’s and Reid’s responses to Hume — the scientific inquirer is seen as submitting the instinctive beliefs of common sense to criticism, refinement, and revision.[2]

What the pragmatists are arguing isn't that we give up theological or philosophical debates, but rather that we move past the old and untrue epistemologies and ontologies that make us think we are pronouncing ultimate truth. In our philosophical and theological and literary and artistic debates, and more importantly, in our values and moral debates, we must acknowledge where our knowledge comes from and approach our meaningful questions—who am I, why am I here, what is the meaning of life—beginning with a "scientific mindset" (as Dewey called it) or a "scientific attitude" (as Peirce called it). This doesn't mean that we cede all knowledge production to scientists; rather, it means that we approach our quests for knowledge with an understanding of the limits of our knowledge and how we get it; that when we make existential or moral or theological claims, they be made by giving reasons and arguments and evidence; that they be explicitly anchored in our experience in this world rather than in our efforts to create certainty and stability by generating knowledge that doesn't correspond to our experiences.

The human capacity to imagine and to think, to begin with experience, create thought-objects, and then imagine all their possibilities—our ability to see thought-objects as infinite possible means—allows us to imagine ourselves into cultural structures that are maladaptive, when we rely on our tenacity, on authority, or on a priori knowledge without accounting for our experience. At best, such disconnected knowledge-systems (cultures) can be merely odd; at worst they can be immoral and even violent. In the pluralistic world we live in today, nothing could be more dangerous.

[1] This is a similar mistake to that made by much of postmodernism and poststructural theory. Where both pragmatic and postmodern theories of knowledge are anti-foundationalist (i.e., deconstructed)—that is, there is no correspondence between signified and signifier—the postmodernists end up making the same mistakes as Plato and most Western philosophers. They deal with knowledge as if it exists outside of lived experiences. Derrida and the social theorists who rely on him, such as Judith Butler, all explain the change of knowledge as originating in the non-correspondence of the signifier. And in sociology, the cultural sociologists make a like mistake, assuming that 'culture' exists independent of human bodies and experiences. The pragmatists, while seeing truth as a process and as necessarily experiential, also insist on the body. The linguistic turn of the postmodernists ultimately fails to describe the actual process of knowledge production in human individuals and groups, because it treats language (discourse) as prior to and structuring experience. This gives far too much power to language and to already-held knowledge at the moment of experience: The postmodern position ignores the embodied process of knowledge production in the first instance.
[2] Susan Haack, "Pragmatism, Old and New" in Contemporary Pragmatism Vol. 1, No. 1 (June 2004), 3-41.

Who(m) did you exploit today? 22 March 2006

Posted by Todd in Capitalism & Economy, Cultural Critique, Inequality & Stratification.
comments closed

Oh my god, I'm speechless. Even if you love capitalism, this is just freakin' funny.

This is from a show on MTV2 called Wonder Showzen, which I had never heard of until this morning on Salon.com.

Beat Kids 104 

The Evolution of God 19 March 2006

Posted by Todd in American Pragmatism, Cognitive Science, Evolution, Philosophy & Social Theory, Religion, Science.
comments closed

Although Western philosophers and scientists have been asking why human beings believe in God for a few hundred years, it's only been recently that scientists have begun actually testing hypotheses and seeking biological explanations for belief in the supernatural. It seems pretty clear at this point that we have evolved a capacity to believe in the supernatural; what is unclear is why and how.

One strand of thought is about the evolution of religion as culture. This is compelling as an idea, that religions "evolved" over time, but there are some drawbacks in the way this argument is being articulated at the moment. The foremost proponent of this approach, Daniel Dennett, whose new book Breaking the Spell is making the rounds, argues basically that religion, like all phenomena, can be studied scientifically and should be evaluated based on its merits like any other cultural formation.[1] So far so good. Where Dennett loses me, however, is in his reliance on an odd idea of the cultural "meme." Richard Dawkins proposed in the early 1980s that cultures might work in similar ways to genes, in discrete units of meaning which are, like genes, struggling to survive; and so cultures can be studied in the same way as organisms, using biological evolution as a metaphor.

While there are some interesting possibilities here, there are also some serious problems with this metaphor; unfortunately, many evolutionary biologists have seized hold of this metaphor and are making fundamentally flawed arguments. Here is a perfect example of where scientists of different orders need to take each other's research seriously. Cultural anthropology and sociology have been studying the formation of cultures and their changes over time for 150 years. But to read Dennett, it's as if none of that work exists. Dennett speaks of religious "memes" as acting independent of their "host organisms" in order to ensure their survival; cultural blending is explained as these memes adapting to new conditions to survive. My two preliminary objections are 1) that cultures do not exist apart from or independent of human bodies or experience, and 2) cultures do not branch or evolve in a manner parallel to how organisms evolve. On the first objection, human cultures are embodied; they arise out of the experiences of embodied individuals in transaction with their environment, physical and social. Some aspects of culture are designed in such a way as to shape perception and prevent their transformation; but the problem with Dennett's interpretation is that cultures do transform. Human beings "trip" over things in their environment constantly, which forces them to a) change their environment, b) change their belief, or c) a combination of the two.

On my second objection, bits and pieces of human cultures move around and among various cultures based on the social, cultural, and personal desires of the "hosts." While some of these desires may be shaped by certain cultural expectations, they can also be agentively contradicted, reformulated, rejected. Sometimes, humans even create new beliefs out of whole cloth to deal with new experiences, completely laying aside without warning previous beliefs. Human brains are incredibly plastic and curious, and they readily adopt cultural bits from each other that "work" for them. In short, cultural blending occurs through human transaction with their environment (broadly defined) and with each other.[2]
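This back-and-forth between keeping beliefs that work and innovating when experience pushes back can be sketched in a toy simulation, loosely in the spirit of the statistical models Boyd and Richerson use (see note [2]). Everything here—the agent counts, the 90% learning accuracy, the shift schedule—is an illustrative assumption of mine, not their actual model:

```python
import random

def simulate(n_agents=200, n_innovators=20, steps=200, shift_every=50, seed=1):
    """Toy model of the innovation/conservatism balance.

    The environment has an optimal behavior (0 or 1) that flips every
    `shift_every` steps. Innovators sample the environment directly
    (noisy individual learning); conformists copy a randomly chosen
    other agent (cheap social learning, blind to change). Returns the
    average fraction of agents matching the optimum over the run.
    """
    rng = random.Random(seed)
    optimum = 0
    behaviors = [rng.randint(0, 1) for _ in range(n_agents)]
    match_rates = []
    for t in range(steps):
        if t > 0 and t % shift_every == 0:
            optimum = 1 - optimum  # the environment changes
        prev = behaviors[:]
        for i in range(n_agents):
            if i < n_innovators:
                # individual learning: right 90% of the time
                behaviors[i] = optimum if rng.random() < 0.9 else 1 - optimum
            else:
                # social learning: copy someone from the previous step
                behaviors[i] = prev[rng.randrange(n_agents)]
        match_rates.append(sum(b == optimum for b in behaviors) / n_agents)
    return sum(match_rates) / len(match_rates)
```

Run with a handful of innovators, the population re-tracks the environment within a few steps of each shift; run with none (`n_innovators=0`), social learning alone just recirculates whatever mix the population started with, and the average match hovers near chance.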

I do not mean to overstate my objection—I agree with Dennett that some cultures are structured in a way to encourage their survival. But I disagree on how those beliefs come to be structured, how they change, and how they blend. For the study of the evolution of religion, biologists and biological philosophers need to team up with scientists who actually study culture (anthropologists and sociologists) to work this out in more complex, accurate, and satisfying terms.

More compelling and ultimately, to me, more interesting are several studies in cognitive science, which are now looking at the brain structures of religious belief. These hypotheses take several related forms. On one hand, the idea is that religion evolved by accident as a by-product of our cognitive evolution, the development of our brains. This is explained in two ways. First, our brains are set up to make distinctions between our bodies and the "outside" world. Through studying babies, researchers have found (in sharp distinction to Jean Piaget's theories, which have until recently been taken as gospel) that human babies as young as 6 months impute intention to other humans but not to objects (they even understand gravity!). This means that babies make a distinction between minds with intentions and inanimate objects. By 1 year old, a baby can follow the gaze of another human being and understand emotional facial expressions. Babies, then, understand both the physical, obdurate world and the social world of humans. Cognitive scientists are now hypothesizing that these two brain processes—interpreting the physical world and interpreting the social world—overlap in our brains. The physical world interpretation seems to be more primitive and is shared by many other animal species, whereas the social interpretation seems to be a more recent evolution and is shared only with other primates, if at all. The hypothesis is that the ability to see the physical world as separate from mind allows us to envision mindless bodies and bodiless minds. A second hypothesis is that our social interpretation system "overshoots, inferring goals and desires where none exist."[3] In this view, the mind-body dualism as envisioned by everyone from ancient Chinese ancestor worshippers to Plato and everyone in between and since arose not out of faulty culture and perception of reality, nor out of Freudian wish-fulfillment.
Rather, it is a natural (if somewhat maladaptive) result of the overlapping systems of sensory interpretation: The benefits of being able to see the physical world as separate from our minds and of being able to interpret the intentions and desires of others around us have enabled the dominance of our species. Religion is an accidental by-product, or as evolutionary scientists call it, a spandrel.

On the other hand, some cognitive scientists are hypothesizing that the religious impulse may itself be adaptive. But I'm waiting to hear more of this research before I decide what I think of this hypothesis. In any case, one proponent of this view, Jesse Bering, has found that even avowed atheists resort to supernatural thinking, often unknowingly, when they are asked about the desires and emotions of dead people (that is, there is a discontinuity between responses about the biology of the dead and responses about the emotions of the dead). His research demonstrates that our minds are exceptionally susceptible to what he calls "ghost stories," or stories that impute intention and desire to dead individuals. In Bering's view, belief in the afterlife and the supernatural may give social advantage by structuring social interaction and minimizing selfish behavior, thereby increasing the possibility of survival of the group.[4] Right now, I personally find this hypothesis less plausible because of the work that has been done on human altruism lately, which seems to arise quite naturally by itself out of, again, the development of the human social brain.

Ultimately, I agree with Dennett's moral philosophical position: 1) religions are natural parts of our evolution[5], 2) many religious belief systems are maladaptive and immoral in the world we actually live in; and 3) religions must be examined, evaluated, modified or rejected accordingly.

[1] Dennett is doing the interview circuit to promote his book, and if you don't have time to read the rather dense book, have a go at one of these: KQED's Forum and Point of Inquiry.
[2] A much better explanation of how human cultures evolve in parallel with their biological evolution and, more to the point, with their experience in the environment can be found in the work of Robert Boyd and Peter J. Richerson, The Origin and Evolution of Cultures (New York: Oxford University Press, 2005). Briefly, Boyd and Richerson have discovered using statistical modeling that human beings tend to function on a balance between innovation and conservatism. Conservatism is adaptive in an environment where the variables are known and where survival is assured. But when the environment changes, the species must also have individuals who are innovators and creative thinkers, who can adapt to the new environmental conditions. This includes both physical environmental and social environmental pressures. A third aspect has to be added in, that of the human capacity to learn and transmit knowledge from generation to generation, not just of simple tools (as chimpanzees do), but complex and detailed knowledge of how to survive in particular environments. The miracle of human cognition and culture is the incredible capacity and plasticity of human beings to learn from each other, innovate, and conserve.
[3] I'm taking this summary from Paul Bloom, "Is God an Accident" in The Atlantic, December 2005, 105-112.
[4] See Jesse M. Bering, "The Cognitive Psychology of Belief in the Supernatural" in American Scientist, March-April 2006, 142-149.
[5] This is in contradiction to Stephen Jay Gould's famous argument that science and religion occupy two separate "magisteria," which do not overlap and describe different aspects of life. I have come to reject this notion, and now find that all human knowledge is overlapping and related, and that no aspect of human life is beyond scrutiny from any direction.

Racial Justice, Part 2: Race and Biology 18 March 2006

Posted by Todd in Biology, Evolution, Inequality & Stratification, Philosophy & Social Theory, Race & Ethnicity.
comments closed

A Jewish American couple (straight or gay, doesn't matter) adopt a baby from China. The couple name the baby Rachel, raise her in the United States, bring her up Reform Jewish in Los Angeles, bat mitzvah her at 13; she graduates from high school and attends college at NYU. If you met her, would you think she was Chinese (or at least Asian)? Would Israeli immigration consider her Jewish? When she fills out her census form or a job application, should she check Asian American?

In traditional racist ideology, Rachel is once and for all Chinese. She cannot escape her "true" heritage, her "bloodline," indeed her "essence." Racism claims that someone's true inner nature is visible in their physical characteristics and that human beings can be ranked hierarchically (in morals, civilization, and intelligence) based on observable physical features.

And yet when examined for cultural or personal characteristics, other than a handful of physical traits—eye shape, hair color and texture, and maybe skin color, depending on her birth parents—Rachel has nothing more in common with a Chinese person than I do. This was really the watershed anti-racist argument Franz Boas made in the early 20th century (which ultimately got him kicked out of Germany), that race is merely an issue of appearance and is no more salient to culture and individual character than different hair color or body weight. Boas got the ball rolling on a massive amount of science which has been building over the past 75 years to prove that we are actually all the same species.[1]

Of course there is much complexity here, because human beings create meanings based on experience; and if you live in a culture that treats you differently based on your physical appearance, you will have to make adjustments—social, cultural, and personal—to deal with that differential treatment. And so the cultural belief in race can actually create what I think of as racial ethnicities. America's history of racism and its institutions (especially slavery and segregation) served as a context wherein those individuals who were marked different, i.e., black, had to create cultural and social means of dealing with their inferior position. In some ways, socially, it doesn't matter that race per se isn't biological, because race is an observable social effect. In the contemporary United States, the underlying belief that physical traits matter, that they are salient for categorizing human beings, has been leading to the emergence of pan-ethnic identities, most notably Asian American, Native American, and Latino. Obviously, people within any of these three groups (let alone African American and European or Slavic American) are diverse, and ultimately these categories fail; and yet again, they do function as social effects of racism. And as if that weren't complicated enough, the ethnic identities of individuals and groups within any of these categories still depend on the salience of physical characteristics, such that the very processes of claiming an ethnic identity and drawing ethnic boundaries rely on racist assumptions.[2]

In a nutshell, if you say that Rachel is Chinese, you are accepting racist beliefs about the salience of physical characteristics to identity. Or say her parents had sent her to Chinese Camp during the summer to learn about "her true heritage"; they would be accepting the racist notion that culture (heritage) flows from race (biology).[3]

On the other hand, it is obvious to any human being that we look different from each other and that, at least historically, our physical characteristics have been broadly grouped into populations. Which physical characteristics count, and which groups they group together, can get incredibly messy. We now live in a world where global migration is increasing, where even populations that do not move are confronted with cultural (racial) differences daily through the global media, and where inter-ethnic marriages and blended cultures are increasingly common. But this is a world where racist beliefs still structure societies around the world, although each society may have a different way of enacting and organizing its racism, as it has arisen out of a different historical process. Might a return to the biological study of "race" have an anti-racist effect on a world which seems to have gone mad with ethnic conflict?

Unfortunately, the history of anthropology (and other social and biological sciences) is such that a return to the biological study of race is fraught with controversy. In the mid-1970s, when Edward O. Wilson suggested that animal behaviors are connected to their biology, the outcry was immediate and harsh. Lucky for him he already had tenure at the time.[4] But scientists working on the evolution of the human species have started to look at race in different ways, which seek to explain why we look different from each other and yet how we are all deeply, biologically connected.

The standard argument has been that different physical characteristics evolved as adaptations to particular environments. Some of these things make sense to me, for example, the presence of the sickle-cell gene in populations where malaria is prevalent. It turns out that if you are a carrier of sickle-cell but don't have the disease, you have a built-in resistance to the parasite that causes malaria. This is true especially of equatorial Africans. Another example: Europeans have a lactase gene that allows them to digest milk into adulthood, whereas most humans get sick from drinking milk as they get older.[5] However, this argument loses steam when you reach the level of observable characteristics, such as skin color. A hypothesis tossed around for years has been that in northern latitudes with less sun, you need lighter skin for vitamin D production. However, Africans living in the north do not suffer from vitamin D deficiency, and Eskimos have brown skin and cover their bodies almost completely.

The most recent hypothesis for things like eye shape, hair texture and color, nose shape, skin color, and even penis size is sexual selection. Many species have mating rituals and criteria which are not in fact biologically adaptive. The standard example is the peacock's tail, which makes him vulnerable to predators but so fetching to the peahens that he can mate. The hypothesis is simply that as humanity has evolved over the past 50,000 years, different populations with different cultural value systems and different aesthetics simply produced their own physical traits through generations of selective breeding. It will be interesting to watch these developments over the coming years.[6]

But for my question of how the biology of "race" might actually help alleviate racism, I am drawn to the broader study of the evolution of humans as a species. The Genographic Project, run by the National Geographic Society, is using blood samples from indigenous peoples around the world to trace the genetic changes in mitochondrial DNA and in Y chromosomes, both of which pass from generation to generation with virtually no genetic blending. This allows scientists to take these markers in mtDNA and the Y chromosome and, using the "molecular clock" technique, measure how long populations have been separated from each other and reconstruct their migration patterns. This migration out of northeastern Africa took place after we were fully human. What strikes me about these findings is that we are all human and we have a shared history.
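The "molecular clock" reasoning can be sketched as a back-of-the-envelope calculation: if neutral mutations accumulate at a roughly constant rate, then the number of differences between two lineages estimates how long ago they diverged. This is only a minimal illustration of the idea, not the Genographic Project's actual method, and the sequence length and mutation rate below are hypothetical round numbers chosen just to show the arithmetic.

```python
def divergence_time(num_differences: int, seq_length: int,
                    mutations_per_site_per_year: float) -> float:
    """Estimate years since two lineages split.

    Differences accumulate along *both* branches after the split,
    hence the factor of 2 in the denominator.
    """
    differences_per_site = num_differences / seq_length
    return differences_per_site / (2 * mutations_per_site_per_year)

# Illustrative (made-up) numbers: 40 differences across a 16,000-base
# mitochondrial sequence, at 2.5e-8 mutations per site per year.
years = divergence_time(40, 16_000, 2.5e-8)  # 50,000 years
```

Real analyses must also correct for sites that mutate more than once and calibrate the rate against fossil or historical evidence, but the core logic is this simple ratio.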

Unfortunately, some cognitive science has found that our brains may be set up to recognize enemies. Preliminary research indicates that across cultures, by the age of 5 or 6, children have the same fear and anxiety responses, or relaxation and comfort responses, to given populations of people. It is clear that these responses are learned (they match those of the parents), but it is highly suggestive that our brains are actually structured to learn such responses. If so, in smaller societies this was probably a useful adaptive feature of our brains, helping communities defend themselves and survive. But in the modern world, it could be one of the most maladaptive features of human biology, as we continue to kill each other based on "race" and ethnicity. If this turns out to be the case, platitudes about "shared histories" and "we're all one species" will probably have little effect on the continuing violence, especially considering the scale of the killing technology at our disposal, unless peoples around the world can be taught multicultural values at very young ages.

[1] See Franz Boas, Race, Language and Culture, published in 1940, in which he summarized his anti-racist observations and arguments to move anthropology as a discipline away from unscientific racism.
[2] See Kwame A. Appiah Color Consciousness for an amazing exploration of the complexities of identity formations based on physical characteristics, how they break down under scrutiny, and why people form those identities anyway. See Orlando Patterson, The Ordeal of Integration: Progress and Resentment in America's "Racial" Crisis for a detailed examination of how the liberal anti-racist ideal clashes with the ethnic identity formations within racial groups with a preservationist-multicultural value.
[3] For a fantastic exploration of the dynamics of authentic ethnic identity formation, I highly recommend Vincent Cheng, Inauthentic: The Anxiety over Culture and Identity.
[4] See Edward O. Wilson, Sociobiology, for a look at the inflammatory work. The introduction to the 25th anniversary edition explains the controversy.
[5] These issues were discussed briefly in a segment on NPR's Science Friday on March 10th with Jonathan Pritchard, Professor of Human Genetics at University of Chicago.
[6] Richard Dawkins covers this hypothesis well (and quite entertainingly) in his newest book, The Ancestor's Tale: A Pilgrimage to the Dawn of Evolution.

Remember when we had the Right to Privacy? 17 March 2006

Posted by Todd in Political Commentary.
comments closed

Click here.

[Thanks be to Randy for the link.]

A Return to 1950s Anti-Gay McCarthyism 16 March 2006

Posted by Todd in Gay and Lesbian History, Gay Rights, History, Homosexuality, Inequality & Stratification, Politics.
comments closed

Salon.com's War Room reports this morning that the Bush Administration has changed the wording in the guidelines for legitimate reasons to deny security clearances. Apparently, whereas the guidelines used to say that it was unlawful to deny a security clearance based on sexual orientation, the new guidelines have added the wording "solely based on" sexual orientation. While this may not seem like a big change, it opens a loophole that will allow the administration to deny or revoke security clearances for individuals who are gay, lesbian, bisexual, or transgendered. To highlight why this is problematic, consider how one might react if the guidelines read that a security clearance could not be denied "solely based on" the race of the individual.

During the 1950s, thousands of gay men and women in Washington, D.C., were fired from their jobs because of their sexuality. Police raided bars, followed men in parks, and even surveilled private homes for evidence. More commonly, the FBI blackmailed individuals by interrogating them and threatening to out them publicly and have them fired unless they gave up lists of people who were homosexual. The witch hunt was not unlike the current atmosphere in the military. (For an excellent history of this period, I highly recommend David K. Johnson, The Lavender Scare: The Cold War Persecution of Gays and Lesbians in the Federal Government.)

And equality for sexual minorities takes yet another step backward.

Moral Art vs. Moralizing Art: “Munich” and Violence 12 March 2006

Posted by Todd in Cinema, Ethics, History, Judaism, War & Terrorism.
comments closed

A movie that successfully asks difficult and complex moral questions is rare. It is far too easy for art to fall into moralization, rather than morality. Moralizing art tells us the right answer, so that believers feel comforted in their moral superiority and unbelievers will see the error of their ways and experience a conversion. But moralizing art is never good art. Rather than fostering an opening of the heart and mind, encouraging a careful and compassionate consideration of difficult issues, it feeds us the moral outcome as if we were children in Sunday School. In order to make its point, moralizing art must rely on piecing together images and ideas in nearly propagandistic ways; in movies, this means easily recognizable and readily intelligible representations that require no subtlety of thought, setting up situations that resonate emotionally but are not in fact realistic, and most egregiously in film, giving us two-dimensional characters that are actually no more than stereotypes. This year's winner of the Best Picture Oscar, "Crash", is such a moralizing film, reducing characters to stock types and putting them in situations where, of course, their Evil is made clear. Steve Lopez of the Los Angeles Times wrote a great response today to "Crash's" boosters: race relations in today's Los Angeles simply don't work the way they are portrayed in the film. For me it is far simpler: "Crash" is moralizing art, and therefore bad art. It hits the viewer over the head with dumbed-down, simplistic moralisms, which aren't helpful at all in understanding the realities of race relations or drawing moral conclusions about race.

Moral art, unlike moralizing art, must be firmly anchored in realistic situations, must represent human beings in their complexity, their moral ambiguity, and show that in real life, morality is not clear and easy, but messy, dirty, and often bloody. Real human beings make morally wrong decisions constantly. Good people do bad things, and vice versa. Steven Spielberg's "Munich" is a much more successful moral film. What I found impressive from the first 20 minutes of the film is the equanimity with which the violence was portrayed. There was no difference in style, technique, or point of view between Palestinian-perpetrated and Israeli-perpetrated violence. The film focuses on the Mossad group that is hunting down and killing those whom the Israeli government had pointed out as the planners of the Munich murders. The characters (and the audience) must grapple with the possibility that what the Mossad assassins were doing was, in fact, immoral. At the most basic level it asks what kind of response to violence is justifiable.

Because of the focus on the Mossad group, the audience is never asked to consider the moral issues from the Palestinian side. And so the movie fails as an examination of the nearly 100-year-old Palestine-Israel conflict (war?). Although it might be too much to ask a film about a group of Israeli assassins to equally humanize and explore the Palestinian point-of-view, I found the moments when Palestinians were represented to fall back into the moral ease of stock characters giving stock speeches. For example, as the team cases out a French-Palestinian's apartment to plant a bomb, his wife delivers a shrill speech about Palestine's suffering; and again, a PLO agent working with the KGB delivers an even more shrill speech to the Bana character. To the extent that these two scenes work at all, it is because of the effect they have on the main characters, who are visibly troubled by confronting real human beings whom they must kill. But these scenes do little to humanize the Palestinians for the audience. So this is not a good film about Israel-Palestine, and should not be interpreted as such. But that should not be grounds to dismiss "Munich" as a failure.

Rather, where the movie succeeds as moral art is in the gradual transformation of the main characters, as they confront what they have done and the implications of violence for violence's sake. When you talk with a man in his home and listen to his wife talk about the suffering of her people, and listen to his daughter play the piano, what then does it mean to murder him? What if he wasn't even involved in the crime you are murdering him for? And most poignantly in the film, what does it do to you to kill him? In other words, does perpetrating violence, even when you believe yourself to be morally justified, come back to damage you, to destroy your own moral self?

Some have dismissed the film as only so much "liberal Jewish handwringing," but if I were Spielberg, I would take that as a compliment. What is most remarkable and humane and worthy about liberal Judaism (and for that matter, liberal Christianity and liberal Islam) is its willingness and indeed its insistence on moral handwringing. Religion that teaches moral absolutes, a black and white world, is a religion that will easily fall into violence, be it social, cultural, or the infliction of bodily harm. Easy morality allows violence against "enemies" and clearly defines who those enemies are: anyone who is not like us. Liberal strands of Judaism, over the past 200 years or so, have stepped out of tribal formulations of ethnic identity and asked what it means to be a Jew among human beings. From an early script of "Munich" available online (the dialogue in the finished movie—where punctuation doesn't count—was more precise and polished):

We're Jews, Avner, Jews don't do
wrong because our enemies do wrong.

We can't afford to be that . . .
decent anymore.

I don't know that we ever were that
decent. Suffering thousands of
years of hatred doesn't make you
decent. But we're supposed to be
righteous. That's what I was
taught, that's Jewish, that
beautiful thing. That's what I
knew. Absolutely.
And I think I've lost that, Avner.
I've lost that too.

Oh that's, that's —

That's everything. I've lost
everything. My, my soul.

Ultimately, the film shows men who are transformed by killing. They become paranoid, haunted, detached. They are morally mangled as they systematically kill other human beings. I suspect that on both sides of any conflict the outcome is the same, unless you have forced yourself to believe in the facile morality that justifies, without question or reflection, the perpetration of violence. I suppose the ultimate question, and perhaps the most fearful one, is whether someone who accepts that facile morality, someone who refuses the moral question and kills or maims believing they are doing the Will of God or fulfilling their patriotic duty, actually feels the impact of taking human life. Palestine-Israel or U.S.-Al Qaeda: one soldier facing one sniper—one insurgent with one hostage—one suicide bomber on one bus—one military pilot and one apartment building—one assassin and one target.

Racial Justice, Part 1 9 March 2006

Posted by Todd in Biology, Inequality & Stratification, Philosophy & Social Theory, Politics, Race & Ethnicity.
comments closed

[I've been reading and thinking a lot about racial justice in the United States over the past year or so, so this topic will probably come up from time to time for the next little while. These are ideas in process and I'm struggling to reconcile conflicting values, so it may come off weird. I would appreciate any input from any of my readers as I continue my thinking.]

There seems to be a fundamental contradiction in the discussion of justice for African Americans (and other minorities), at least in the way it's debated in the academy. On one hand, the anti-racist position took hold as a political value during the post-war Civil Rights movement, a so-called "liberal" perspective. On the other hand, discussions about multiculturalism have developed since then into a value of cultural difference itself, a so-called more "radical" approach (although it can be more or less radical, depending on the particular argument). In some strands of multicultural thinking, there is an almost fetishistic relationship to cultural and racial differences, such that the maintenance of cultural differences over time becomes an end-in-itself. Many if not most activists, intellectuals, researchers and politicians hold these two values, anti-racism and preservationist multiculturalism, simultaneously, without understanding their contradiction or connection.

The anti-racist position holds that 19th century essentialist anthropology and biology were precisely wrong when they argued that cultural differences and differences in character and ability were due to biological, bodily, or genetic factors among the "races." Indeed, the anti-racist position has been continually bolstered over the past 50 years by research that has demonstrated that, in a nutshell, there are no biological differences among human beings of so-called "different" races, and that things like skin color, nose shape, eye shape, and hair texture are no more meaningful for character or culture than hair color or freckles. (It is important to note, however, that human beings, like all mammals, do develop population-specific adaptations, so that for example there is a connection between the sickle-cell gene and immunity to malaria; but these are population-based adaptations, not racial ones.) So given that race as it has been conceived of since the 18th century does not, in fact, exist, anti-racism insists that social inequalities based on race must be eliminated. So far, so good.

Confusingly, because of the way human cultures and societies work, the fact that American culture was racist (that is, that it developed over time a racist ideology that historically structured our society) produced a particular material and social environment in which African Americans lived, namely slavery and segregation. Even under the best of circumstances, this division in American culture had its effect at the individual level (what W.E.B. Du Bois called "double consciousness") and at the cultural level (what Du Bois called the "Story, Song, and Spirit" of African American culture). In other words, around the country, African Americans dealt with their lived experiences by producing cultures to explain them and make them bearable. The net result of this history has been that a major part of U.S. culture is something that might be called "African American Culture."

However, like all cultures, this African American culture is diverse and multiple, as African Americans in different regions had different experiences and produced different cultures; African Americans are themselves individually varied, like all Americans, by religion, class, gender, sexuality, ability, education, politics, etc. To further complicate matters, African American (and, by analogy, European American) cultures are already hybrid, or culturally mixed, as African and European American traditions collided and blended over time.

In the preservationist multicultural position, the maintenance of these distinctive African American cultures over time is the goal. On its face, this seems aesthetically and morally desirable. African Americans are part and parcel of America writ large, and their heritage and community and historical experience should be honored and valued by America at large. In a large democracy, it likewise appears normal and right that individuals with the right to free association will create and maintain social group identities and will identify with others of like experience and background.

The problem comes when you try to put the anti-racist and the preservationist-multicultural values together. In order to preserve African American cultural identities, at least on the individual level, you have to be able to identify people as racially different. So in a preservationist-multicultural structure, the very terms of racism—the intelligibility of racial differences based on skin color, nose shape, and hair texture—must remain salient in order to preserve the culture in question; while the anti-racist value system seeks at the same time to reduce the salience of "racial differences" by creating, ultimately, a race-blind society with substantive (not just structural) equality. So the question becomes: can you work toward these two goals at the same time? Can you eliminate racism while at the same time insisting on the salience of race over time in order to preserve African American culture(s)? Can you preserve individuals' and groups' connections to their pasts and heritage while at the same time undermining the means (race distinctions) whereby those groups and identities were formed in the first place?

Based on the unequal distribution of social, cultural, political, and economic goods in U.S. society—that is, based on the persistent inequality between people of African descent and people of European and Slavic descent in the U.S.—I can't help but lean to the anti-racist side. A social context wherein individuals can have histories, identities, and cultures within the larger democracy is an end-in-view of any effective democracy; but I think that substantive equality must take priority. Until African Americans reach parity in education, property ownership, and occupation, the "right" to have a particular identity is a hollow one. In other words, a mere preservationist appreciation of diversity is insufficient to close the substantive gap between white and black America. Real, substantive social and economic reforms must be put into place.

To be continued…