
Neanderthal Females Kicked Ass 14 November 2007

Posted by Todd in Biology, Evolution, Gender.

Apparently, a pair of anthropologists at the University of Arizona are arguing that Neanderthal women hunted alongside the males. The evidence is pretty sketchy, but I like the image. In any case, we know that even in our line, Homo sapiens females were far more involved in the day-to-day survival of their (relatively) egalitarian bands in prehistory. The anthropologists say the archeological evidence still shows a gender division of labor, but they think it’s clear that women were also hunters.

Now I’m fine with the idea of las chicas neanderthalas hunting mastodon. What raised an eyebrow, however, was the Boston Globe’s lead-in to the story, which suggested that this practice might be what led to the species’ extinction. The demographic issue is obvious: hunting is dangerous, and losing fertile females would be catastrophic to a species’ survival. The problem here is that the species lasted for nearly 150,000 years. A practice so detrimental to the population as losing fertile females to a random tusk here or a stomping furry foot there would have led to a much quicker demise of the population (which coexisted for thousands of years with Homo sapiens in Europe).

In order for that to have been a major factor in the extinction of Homo sapiens neanderthalensis, it would have to have been a recent behavioral innovation. Otherwise, it would have led to demographic collapse much earlier. To have lasted 150,000 years with that behavior would have to mean that Neanderthal females kicked some serious ass with a spear.
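The demographic intuition here is easy to make concrete with a toy projection. This is entirely my own illustration, with invented parameters (not from the Globe piece or the Arizona study): assume each generation of adult females exactly replaces itself, then see what a modest extra hunting mortality among fertile females does over twenty generations.

```python
# Toy demographic sketch (illustrative only; every number is made up,
# not drawn from the anthropological record). It shows how even a small
# per-generation loss of fertile females compounds across generations.

def project_population(n_females=100.0, births_per_female=4.0,
                       survival_to_adulthood=0.5,
                       hunting_mortality=0.0, generations=20):
    """Return the adult-female population after each generation.

    Each generation, a fraction `hunting_mortality` of adult females
    dies before reproducing; the survivors bear `births_per_female`
    children (half of them daughters), of whom `survival_to_adulthood`
    reach adulthood themselves.
    """
    history = []
    females = n_females
    for _ in range(generations):
        mothers = females * (1.0 - hunting_mortality)   # hunting deaths first
        daughters = mothers * births_per_female * 0.5   # half of births are girls
        females = daughters * survival_to_adulthood     # next adult generation
        history.append(females)
    return history

# With these made-up numbers the baseline population exactly replaces
# itself, while losing 10% of fertile females per generation shrinks the
# band by roughly an order of magnitude within 20 generations.
stable = project_population(hunting_mortality=0.0)
declining = project_population(hunting_mortality=0.1)
```

Since 150,000 years is on the order of several thousand Neanderthal generations, any practice carrying even a small net demographic cost would have produced a collapse long before the species actually disappeared, which is exactly the objection above.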

What Is a Scholar? 12 November 2007

Posted by Todd in Academia & Education, Teaching.

Reading in the October New Yorker magazine the brief intellectual biography of Jacques Barzun, historian and culture critic, I found myself wondering again, what exactly makes someone a scholar? Is a scholar a teacher or a researcher? Does a scholar retreat to their study to read and think, or is a scholar engaged on the street with the proverbial “people”? Is being a scholar a call to a certain ethical kind of life? Indeed, is scholarship a calling at all? If you are a scholar, how should you be spending your days? Do other people have to recognize you as a scholar in order to call yourself a scholar, or is the title “scholar” not something anyone should ever say of themselves, at least not among those who would consider it gauche? Is scholarship pretentious or vital? Does having a Ph.D. make you a scholar or just a wannabe?

Barzun has spent his life reading, writing, thinking, and teaching (he turns 100 this year). Thirty-five books and countless articles later, a popular teacher at Columbia, recognized as an originator of “culture critique” in literary theory, and a champion of a humanistic kind of history (as opposed to the social scientific mode), Barzun seems to exemplify what a scholar is or should be. Some of my favorite thinkers were immensely productive in their first 40 years of life, and most were also teachers.

Perhaps the most glaring exception to this would be William James who spent the first 36 years of his life trying to figure out what he wanted to be when he grew up. Reading James’ biography this summer, I felt somewhat comforted, not simply because of my low productivity so far, but because he and I share so much in common in our temperaments.

But the academic system of the United States has made scholarship into a job, and I feel more or less like a factory worker: produce bachelor’s degrees at breakneck speed and, in your down time, produce enough scholarship to get you through tenure. The increasing proletarianization of the academic labor force, the loss of funding for public universities since the 1970s, and the transformation of the professoriate into a temporary, expendable workforce have created a publish-or-perish mentality, spoken of in bitter tones by academics even as they work to keep their jobs. Research has become instrumental (and in many cases slipshod), teaching completely rationalized and assessed (the fault, in my opinion, of “scholars” of education who justify their existence by creating nonsensical rubrics for judging outcomes), and the life of the mind is now a giggle-inducing joke among graduate students and professors alike.

Somewhere around age 16, I decided I was going to get a Ph.D. — I had loved school for as long as I could remember and it seemed like a good way to keep on keepin’ on. I loved learning and exploring, read constantly, soaked up information wherever I could find it. My mission and foray into orthodox Mormonism simultaneously halted my “mindlife” and prepared me for new directions, questions, and insights. But college did as much to squelch my desire to learn as it did to stoke my curiosity. For some reason, much of my schoolwork came to feel like a chore instead of a joy (of course, that had been true in high school as well, but as a teenager I naively thought that was merely an aberration).

Graduate school introduced me to the realities of academic work, as opposed to the sheer joy of learning. There were moments when I wondered what the hell I was doing. Teaching, which I thought I would love, proved to be incredibly difficult. My second year teaching Western Civilization, I had a class of students who had put off the requirement until they were seniors, thought they already knew the course material, and resisted every single thing I tried to teach or do in the classroom. I had serious doubts about becoming a teacher at that moment. Despite that roadblock, I decided to continue in my quest to become a university professor.

As an undergraduate, I imagined my professorial life to be one of reading a lot, and having animated conversations with curious students. I imagined thinking important thoughts and writing them down. I imagined arguments and debates with other impassioned scholars.

Graduate school introduced me to the idea of the “public intellectual”, and I began to think that perhaps my research and thinking could actually help people, change the world. I knew that it was naive and idealistic, and yet I still secretly harbored the hope. I also found a great deal of satisfaction in teaching, and thought of it as opening minds and engaging with bright young people who inspired me.

Now two years into my tenure-track job, I find I’m in a place of disillusionment. It’s a weird place to be when I knew much of this coming into the position, having been told by mentors and friends what being a professor is “really” like. Yet actually living it is somehow more bracing and upsetting than merely being told about it. I teach in the largest public university in the United States, guided by a legal dictate from the 1960s to offer undergraduate education to (I think) the top 30% of all California’s high school graduates. The CSU has to fight for minimal funding every single year, but pays its system-level administration absurd amounts of money and benefits with no transparency (Arnold Schwarzenegger just vetoed the bill that would’ve made executive pay in the system transparent and accountable). Student-teacher ratios continue to rise, as does the ratio of courses taught by temporary, part-time faculty vs. full-time, tenure-track faculty (if I’m not mistaken, a majority of our courses are now taught by temp workers). The preparation of students coming in is often well below what is necessary to succeed in a university. The teaching load, which was reasonable before the publish-or-perish model of tenure, is now crushing if you want anything resembling a research program, which ironically you must have in order to get tenure (specifically, 2x the teaching load of the UC system, significantly lower pay, but 1/3 of tenure still based on research). The relative comfort of tenured professors who bought houses in the 80s, when you could still buy a home on our salary, combined with a tight and deeply conservative institution, makes institutional inertia almost a given. Like professors all over the country, I’m disheartened by the depth of apathy in many of my students.

In short, my idealism about being a scholar has been crushed. I’m not saying this to whine or complain (although I’d really appreciate making a more livable wage for the Bay Area). Rather I ask the question: Given the realities of higher education in the United States today, and my particular experience in my tenure-track job, what does it mean to be a scholar? Is scholarship the privilege of a select few at Columbia University? Or is it a thing of the past? Are there no places left where curious students and excited professors talk, argue, engage, and stimulate each other’s minds? Do you have to quit academia to be a scholar at this point? Is a university truly nothing more than a factory producing credentialed workers for the economy?

In many ways, these are personal questions. I have to figure out what, if any, of my old ideas of having a “life of the mind” (before the phrase made me scoff) I can salvage; in what ways might I still, in a more realistic sense, be a “public intellectual”? Given the expectations and values of my students, what should my expectations be of them and of my life as a teacher? Should I have a reasonable expectation of being able to research and write throughout my career, or is that something I need to modify? For someone who has felt that scholarship was his vocation since he was 16 years old, these are not trifling questions. John Dewey argued that a good philosophy is one that meets the world where it is. This can be painful and difficult if you still find yourself attached to the old philosophy, one built on values that still resonate in your core world view.

The Allure of Determinism 12 November 2007

Posted by Todd in Biology, Philosophy & Social Theory, Philosophy of Science, Postmodernity and Postmodernism, Queer Theory, Science, Social Sciences.

Inspired by John Dewey’s declaration that Darwinism changed everything [see epigraph to the right], I’ve spent the past three years reading everything I can get my hands on about human evolution, with a healthy dose of cognitive science mixed in. My undergraduate education was firmly post-modern/post-structural, seeing all meaning as ephemeral and utterly situational: human life could only be explained by the wispy, evanescent strands of meaning humans attached to it. Graduate school introduced me to a more social-scientific mode of saying roughly the same thing: human life, or rather the meaning of human life, is socially constructed. In both educational experiences, the dominant view of human nature was that it did not exist, because anything you could say about it would be, of course, socially constructed or pure culture.

My mind already primed from Dewey’s form of philosophical naturalism, I delved into the natural history of humankind, opening my eyes to some basic, yet key insights. Humans share not only a history but an evolutionary past. Human bodies are not the mere objects of our capricious and malleable cultures, but are indeed the source of our cultural capacity. I had to learn stochastic thinking in a new way to see how generalizations could still be made despite the rather overwhelming diversity of humans.

But now I grow weary of the evolutionary scholarship, more particularly the evolutionary psychology. Just as my patience with the cultural determinism of my education had worn thin in its effort to eschew our bodies, so now my patience wears thin with the “just so” explanations coming from some of the more prominent researchers in biological anthropology, human evolution, population genetics, and, most egregiously, evolutionary psych.

I find that I’m reacting against determinism on both sides of this theoretical fence. Determinism seems, to be honest, lazy. It seeks easy explanations for human behavior. And in fact it sometimes produces rather aesthetically pleasing results, often sublime in their simplicity even when dead wrong. When you take something like homosexuality (which I’ve discussed at length here) and you have to tease out the interplay of evolution, hormones, genitals, fetal development, cognition, sensing and problem-solving brains, child rearing, cultural mores, social pressures, pop culture, and institutions, the results are messy and contingent. You must rely on probability to weigh the interplay of multiple possible causes, and you have to hold in your mind the relationship between individual cases and overarching trends, commonalities, and generalizations.

This is nothing I haven’t written here before, nothing new. I can’t help but wonder when the biological and the social/cultural will finally merge and start working together to deepen our understanding of what it means to be human.