
The Allure of Determinism 12 November 2007

Posted by Todd in Biology, Philosophy & Social Theory, Philosophy of Science, Postmodernity and Postmodernism, Queer Theory, Science, Social Sciences.

Inspired by John Dewey’s declaration that Darwinism changed everything [see epigraph to the right], I’ve spent the past three years reading everything I can get my hands on about human evolution, with a healthy dose of cognitive science mixed in. My undergraduate education was firmly post-modern/post-structural, seeing all meaning as ephemeral and utterly situational: human life could only be explained by the wispy, evanescent strands of thought humans attached to it. Graduate school introduced me to a more social-scientific mode of saying roughly the same thing: human life, or rather the meaning of human life, is socially constructed. In both educational experiences, the dominant view of human nature was that it did not exist, because anything you could say about it would be, of course, socially constructed or pure culture.

My mind already primed from Dewey’s form of philosophical naturalism, I delved into the natural history of humankind, opening my eyes to some basic, yet key insights. Humans share not only a history but an evolutionary past. Human bodies are not the mere objects of our capricious and malleable cultures, but are indeed the source of our cultural capacity. I had to learn stochastic thinking in a new way to see how generalizations could still be made despite the rather overwhelming diversity of humans.

But now I grow weary of the evolutionary scholarship, particularly the evolutionary psychology. Just as the cultural determinism of my education grew thin in its effort to eschew our bodies, so my patience has worn thin with the “just so” explanations coming from some of the more prominent researchers in biological anthropology, human evolution, population genetics, and, most egregiously, evolutionary psych.

I find that I’m reacting against determinism on both sides of this theoretical fence. Determinism seems, to be honest, lazy. It seeks easy explanations for human behavior. And in fact it sometimes produces rather aesthetically pleasing results, often sublime in their simplicity even when dead wrong. When you take something like homosexuality (which I’ve discussed at length here) and you have to tease out the interplay of evolution, hormones, genitals, fetal development, cognition, sensing and problem-solving brains, child rearing, cultural mores, social pressures, pop culture, and institutions, the results are messy and contingent. You must rely on probability to determine the interplay of multiple possible causalities, and you have to hold in your mind the relationship between individual cases and overarching trends, commonalities, and generalizations.

This is nothing I haven’t written here before, nothing new. I can’t help but wonder when the biological and the social/cultural will finally merge and start working together to deepen our understanding of what it means to be human.


Scientific Mindset 1 July 2007

Posted by Todd in American Pragmatism, Democratic Theory, Philosophy & Social Theory, Philosophy of Science, Reviews, Science.

In reading Tim Adams’ review of Natalie Angier’s new book, The Canon: A Whirligig Tour of the Beautiful Basics of Science, I came across this gem:

‘Science is rather a state of mind,’ Angier argues and, as such, it should inform everything. ‘It is a way of viewing the world, of facing reality square on but taking nothing for granted.’ It would be hard to argue that this state of mind was advancing across the globe. We no longer make and mend, so we no longer know how anything works.

This reminded me of one of my favorite quotes by John Dewey, from Freedom and Culture. Dewey had begun to integrate scientific thinking into his philosophy shortly after reading Darwin and coming to grips for the first time with evolution (see the Dewey quote to the right); eventually this led him to parallel thinking with William James and Charles Peirce on the nature of truth and the origins of human knowledge; and finally to a philosophical adaptation of G.H. Mead’s social behaviorism. In Freedom and Culture, Dewey is trying to convince readers that, given the way our brains work, and given what the scientific method has taught us about how to gain reliable knowledge upon which to base social decisions, we should adopt a more generalized “scientific mindset.”

“This interest [in scientific inquiry] has developed a morale having its own distinctive features. Some of its obvious elements are [1] willingness to hold belief in suspense, ability to doubt until evidence is obtained; [2] willingness to go where evidence points instead of putting first a personally preferred conclusion; [3] ability to hold ideas in solution and use them as hypotheses to be tested instead of as dogmas to be asserted; and [4] (possibly the most distinctive of all) enjoyment of new fields for inquiry and of new problems.”

O Say, What Is Truth? 28 March 2007

Posted by Todd in American Pragmatism, Cognitive Science, Evolution, Philosophy of Science, Postmodernity and Postmodernism.

[Posted this on FLAK earlier today, and thought I’d cross-post it here.]

I find the American Pragmatists’ definition of truth to be the most helpful (esp. Charles Peirce, William James, and John Dewey). They were able to combine the idea that there are objective facts independent of human perception (i.e., that truth isn’t located in perception) with the idea that human perceptions of those facts change over time (i.e., that human knowledge arises from changing experiences in their environments). They argued that, in terms of human knowledge, truth is a process and is functional. This isn’t a kind of postmodern relativism (although they were relativistic in the narrow sense), but rather the admission that human knowledge is always incomplete. First, truth is a process because it arises in human experience of an environment, rather than existing as a Thing-in-itself. Second, it is functional, because human beings know truth based on whether it “works” in their environmental experience.

P, J & D argued that science is simply a formalization and refinement of the natural way that human brains gather knowledge from and about their environments: through experiencing them and thinking about their experiences. Science merely takes that natural, biological process and makes it rigorous. But science also only works because it has built into it the notion that new experiences may bring new knowledge tomorrow.

Dewey took this a step further. Human history, he argued, has been driven by the Quest for Certainty: humans seek to understand their environments perfectly in order to control them (a theme which has since been picked up by cognitive science and, most interestingly, by evolutionary psychologists), and philosophy & science have likewise been aimed at achieving Certainty. But since we now understand how human brains work (he was drawing this conclusion in the 1930s, when cognitive psychology was still relatively new), and since we know that environments constantly change (which he took from Darwinism) and that our brains thereby constantly adapt to those changes, Dewey argued that in formal searches for truth (i.e., scientific and philosophical), we must jettison the Quest for Certainty and embrace the fact that knowledge is always already Uncertain.

What is Known at any given time by any given group or individual is Known precisely because it Works in the environment at hand (i.e., truth as function). But that Known will constantly change as the organism (human individual or group) moves through time and its experience changes (i.e., truth as process). But the objective truth which exists independent of human perception is also knowable, if only Uncertainly and partially, through the processes and methods of science (small-s), which are to be open to experience, to hold all ideas in solution and replace them when new information demands it, and to actively seek understanding without ever believing you have achieved Ultimate Truth. Truth is dead the instant you think you have it and that there is nothing more that can be said; truth only works, or rather it only works Correctly, when it is understood as a Process.

Knowledge: Faith & Reason 21 December 2006

Posted by Todd in Academia & Education, Democratic Theory, Ethics, Law/Courts, Philosophy of Science, Religion, Science.

My alma mater (soul mother?), the University of Kansas’ Hall Center for the Humanities, hosted a series of lectures this fall in their Difficult Dialogues series, including an amazing range of people talking about the role of religion and science in public life. They are in RealMedia format, and so require RealOne Player. They are worth the time. I’m about halfway through Judge Jones’ talk (the judge who presided over the Dover intelligent design case of last year), and will comment later. Here’s a link to the Hall Center’s web site, and the links to all the talks.

KU: Hall Center for the Humanities

Thinking about Naturalism and Social Theory 20 December 2006

Posted by Todd in American Pragmatism, Cognitive Science, Cultural Sociology & Anthropology, Evolution, Philosophy & Social Theory, Philosophy of Science, Sexuality.

[I’ve been trying to think through, again, how I think evolutionary theory and cognitive science could inform a more powerful and accurate social theory. This is from a conversation I’ve been having on the ASA’s Evolutionary Sociology listserve.]

I’m borrowing the words “naturalistic” and “naturalism” from the philosophy of science. It’s a particular orientation that, in a nutshell, insists that humanistic knowledge must align with and be supported by the current state of scientific research. I’m somewhat of a John Dewey specialist; Dewey was one of the originators of that line of thinking: that human beings *are* animals, subject to evolution, and that science can know things about us that philosophy (humanities) cannot. Dewey’s social theory (most fully developed after 1924; see esp. Experience and Nature) most importantly relies on the assumptions of what today we would call cognitive science, with the assumption that human cognition *evolved*.

In my own work (I’m a cultural sociologist, for lack of a better word), I apply these assumptions in my analyses and am currently working on an article-length piece that will propose a naturalistic social theory of culture, relying heavily on evolutionary psych/cognitive science.

My own orientation to these issues is that the social sciences (esp. cultural anthro, poli sci, history, and most forms of sociology) ignore the findings of other sciences, especially cognitive science and paleoanthropology. I find that the retreat to “constructivism” is often facile and without careful thinking or understanding about how phenotypes come to be (for example) or about the interaction of human cognitive processes and meaning formation (for another example). BUT, having read a lot of sociobiology, I think there are still big problems with much of it, as it likewise tends not to account for the power of human cognition to transform human environments (both social and physical). In other words, observing the behavior of a marmot isn’t the same as observing human behavior, because human brain evolution actually enables us to create meaning systems (and concomitant practices) that are maladaptive and/or out of touch with reality. More simply put, human cognition (and by extension, culture) allows human beings to act in ways that do not match their “nature” and which are in fact biologically maladaptive. Further, sociology and anthropology and history have done a lot of work over the past 200 years trying to figure out how meanings (symbols, practices) move through time and work to shape interaction and social structure. I firmly believe that many of their findings are still valid, but they need to be revised by accounting for what we know from the biological sciences. And sociobiologists need to take those 200 years of work seriously as well, and see that many of the understandings of social science are actually quite necessary in explaining human social biology.

I do not believe with constructivists that perception is completely socially constructed; nor do I believe with the cruder forms of sociobiology that it is purely biological (genetic, brain morphology, etc.). I think some of the most interesting thinking along these lines is being done by geneticists who are trying to work out the complicated dance between the gene and the environment in producing a phenotype.

Likewise, I think that a naturalistic sociology would work to describe (and maybe explain?) the complicated dance between the genetic, hormonal, embodied human and its social environment and meaning systems (i.e., cultures), including how the social environment can shape phenotypic expression, how the genotype actually limits the power that a social environment can have, and how it limits what kinds of social and cultural arrangements would be adaptive (or at worst, evolutionarily neutral).

Full disclosure: Much of my feeling along these lines (and perhaps my own personal narrative which led me to explore this area) comes from the fact that I’m gay. The social constructivist explanation of homosexuality makes absolutely no sense to me when it’s carried beyond the obvious, that different societies in different times and places make sense of sexual desires differently. But to argue that the desire itself is social in origin borders on the absurd. I think that homosexuality is a good illustration of how the biological limits the social.

How to Spot Bad Science a Mile Away 22 August 2006

Posted by Todd in Commentary, Philosophy of Science, Science, Teaching.

The Chronicle of Higher Education published a brief article in 2003 describing a rubric for determining the legitimacy of scientific claims made in public. As I read through the seven points by Dr. Park (a physics professor), my students kept coming to mind, as they resist everything I give them. Admittedly, social science has a different dynamic: by its very nature, everyone thinks they are automatically experts (you know, like, they live in society already). This semester, I’ll be teaching quite a bit of evolution and cognitive science in a course on how different cultures developed their particular views of and relationships to the ecosystems in which they live (a.k.a. nature; a.k.a. the environment), so I am braced for irritating evolution conversations and am considering using this list to discuss the issues of science in culture tomorrow, on the first day of class.

1. The discoverer pitches the claim directly to the media. The integrity of science rests on the willingness of scientists to expose new ideas and findings to the scrutiny of other scientists. Thus, scientists expect their colleagues to reveal new findings to them initially. An attempt to bypass peer review by taking a new result directly to the media, and thence to the public, suggests that the work is unlikely to stand up to close examination by other scientists.

One notorious example is the claim made in 1989 by two chemists from the University of Utah, B. Stanley Pons and Martin Fleischmann, that they had discovered cold fusion — a way to produce nuclear fusion without expensive equipment. Scientists did not learn of the claim until they read reports of a news conference. Moreover, the announcement dealt largely with the economic potential of the discovery and was devoid of the sort of details that might have enabled other scientists to judge the strength of the claim or to repeat the experiment. (Ian Wilmut’s announcement that he had successfully cloned a sheep was just as public as Pons and Fleischmann’s claim, but in the case of cloning, abundant scientific details allowed scientists to judge the work’s validity.)

Some scientific claims avoid even the scrutiny of reporters by appearing in paid commercial advertisements. A health-food company marketed a dietary supplement called Vitamin O in full-page newspaper ads. Vitamin O turned out to be ordinary saltwater.

2. The discoverer says that a powerful establishment is trying to suppress his or her work. The idea is that the establishment will presumably stop at nothing to suppress discoveries that might shift the balance of wealth and power in society. Often, the discoverer describes mainstream science as part of a larger conspiracy that includes industry and government. Claims that the oil companies are frustrating the invention of an automobile that runs on water, for instance, are a sure sign that the idea of such a car is baloney. In the case of cold fusion, Pons and Fleischmann blamed their cold reception on physicists who were protecting their own research in hot fusion.

3. The scientific effect involved is always at the very limit of detection. Alas, there is never a clear photograph of a flying saucer, or the Loch Ness monster. All scientific measurements must contend with some level of background noise or statistical fluctuation. But if the signal-to-noise ratio cannot be improved, even in principle, the effect is probably not real and the work is not science.

Thousands of published papers in parapsychology, for example, claim to report verified instances of telepathy, psychokinesis, or precognition. But those effects show up only in tortured analyses of statistics. The researchers can find no way to boost the signal, which suggests that it isn’t really there.
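This signal-to-noise point can be made concrete with a toy simulation (my own sketch, not part of the quoted article; the function name and numbers are illustrative). Averaging n independent measurements shrinks the noise on the mean by a factor of the square root of n, so a real effect's signal-to-noise ratio grows as data accumulate, while a null effect's stays pinned near zero no matter how much data you collect:

```python
import random
import statistics

def measured_snr(true_signal, noise_sd, n_trials, seed=0):
    """Estimate the signal-to-noise ratio after averaging n_trials
    noisy measurements of a constant true_signal."""
    rng = random.Random(seed)
    measurements = [true_signal + rng.gauss(0, noise_sd) for _ in range(n_trials)]
    mean = statistics.fmean(measurements)
    sem = noise_sd / (n_trials ** 0.5)  # standard error of the mean
    return mean / sem

# A small but real effect climbs well clear of the noise as trials accumulate.
real_effect = measured_snr(true_signal=0.5, noise_sd=2.0, n_trials=10_000)

# A nonexistent effect stays within ordinary statistical fluctuation;
# the signal cannot be boosted because it isn't there.
null_effect = measured_snr(true_signal=0.0, noise_sd=2.0, n_trials=10_000)
```

With these numbers the real effect's SNR comes out around 25, while the null effect's remains within a few units of zero. This is the sense in which a genuine signal can always, at least in principle, be improved.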

4. Evidence for a discovery is anecdotal. If modern science has learned anything in the past century, it is to distrust anecdotal evidence. Because anecdotes have a very strong emotional impact, they serve to keep superstitious beliefs alive in an age of science. The most important discovery of modern medicine is not vaccines or antibiotics, it is the randomized double-blind test, by means of which we know what works and what doesn’t. Contrary to the saying, “data” is not the plural of “anecdote.”
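Why randomization matters can be seen in a toy model (my own illustration, not from the quoted article; names and parameters are made up for the example). Suppose a supplement has no real effect, but healthier people are more likely to take it. Comparing takers to non-takers then manufactures a benefit out of nothing, which is exactly what a pile of glowing anecdotes does; randomizing who gets the supplement removes the confound:

```python
import random

def trial_effect(randomized, n=10_000, seed=1):
    """Estimated treatment effect of a supplement that truly does nothing.

    Each subject has an unobserved baseline health level; outcomes depend
    only on that baseline plus noise. Without randomization, healthier
    subjects self-select into taking the supplement.
    """
    rng = random.Random(seed)
    treated, control = [], []
    for _ in range(n):
        health = rng.gauss(0, 1)                  # unobserved confounder
        if randomized:
            takes = rng.random() < 0.5            # coin-flip assignment
        else:
            takes = rng.random() < (0.8 if health > 0 else 0.2)
        outcome = health + rng.gauss(0, 1)        # supplement adds nothing
        (treated if takes else control).append(outcome)
    return sum(treated) / len(treated) - sum(control) / len(control)

observational_effect = trial_effect(randomized=False)  # sizeable spurious "benefit"
randomized_effect = trial_effect(randomized=True)      # near zero, as it should be
```

The self-selected comparison shows a large apparent benefit; the randomized comparison correctly finds essentially nothing. Blinding addresses a separate confound, expectation effects, which a simulation like this doesn't capture.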

5. The discoverer says a belief is credible because it has endured for centuries. There is a persistent myth that hundreds or even thousands of years ago, long before anyone knew that blood circulates throughout the body, or that germs cause disease, our ancestors possessed miraculous remedies that modern science cannot understand. Much of what is termed “alternative medicine” is part of that myth.

Ancient folk wisdom, rediscovered or repackaged, is unlikely to match the output of modern scientific laboratories.

6. The discoverer has worked in isolation. The image of a lone genius who struggles in secrecy in an attic laboratory and ends up making a revolutionary breakthrough is a staple of Hollywood’s science-fiction films, but it is hard to find examples in real life. Scientific breakthroughs nowadays are almost always syntheses of the work of many scientists.

7. The discoverer must propose new laws of nature to explain an observation. A new law of nature, invoked to explain some extraordinary result, must not conflict with what is already known. If we must change existing laws of nature or propose new laws to account for an observation, it is almost certainly wrong.

My favorites are Nos. 1, 2 and 5.

The Root of All Evil?: Part 1—The God Delusion (review) 1 July 2006

Posted by Todd in Christianity, Cultural Critique, Documentary Film, Islam, Judaism, Philosophy of Science, Political Commentary, Religion, Reviews, Science.


[My review of Part 2—The Virus of Faith can be found here.]

There has been much ado about Richard Dawkins’ Channel 4 two-part documentary, The Root of All Evil?, mainly because of Dawkins’ almost strident atheism and because of the relatively inflammatory title. [The video is not yet available in North America, but both parts are currently downloadable from Google Video, Part 1 here and Part 2 here.] Having been raised in a pretty orthodox Mormon household and having family on both sides who are quite religious now, I tend to be less afraid of religiosity in general than Dawkins seems to be. And I do sympathize with the religious impulse, the desire to believe in something greater, for an explanation of both the uncertainty and fickleness of life as well as the disappointment with the realities of our existence.

When I realized I no longer believed in God, I found myself with twin wounds, one left by the loss of community, the other by the loss of submission to something greater. Dawkins seems to miss these dynamics completely, the importance of communal bonds and identity formation in people’s desire for and attachment to their religious beliefs. On Bill Moyers’ new series, On Faith & Reason, Colin McGinn said that when he left faith behind he found the world without God to be so much more vibrant and rich than it ever was with God. Although I did also eventually arrive at that conclusion, the years it took me to separate myself from religion were painful and transformed my most basic world view. The difficulty of replacing one’s world view and accepting the full implications of rationality and science can be quite overwhelming, but the documentary presents Reason as an easy enlightenment to which folks should readily convert.

So the main problem I had with the documentary emerges from my personal experience combined with my training as a sociologist: Dawkins doesn’t seem to fully understand how and why religion has the power over people that it does, the role that it actually plays in people’s lives to give them meaning. All he seems able to see is its irrationality and anti-scientific mindset, along with the horrifying moral consequences of such belief. I had no qualms or disagreement with Dawkins on these points, but the documentary seemed to set up two categories, religion and science, without addressing the complexities of why people believe in the first place and why it can be so hard for an individual, emotionally, socially and psychologically, to leave a faith-community. An exploration of these dynamics can help us understand more deeply why people refuse the evidences of science and rational argument; more importantly, it could help us have more productive dialogues with the faithful, something of utmost importance if we are going to save our democracies around the world from collapsing into theocracies.

Another quibble I had was that the documentary painted religion with so broad a brush that suicide bombers and rabid fundamentalists are lumped in with the millions of religious who fight injustice, hunger, and violence world wide. Human religions are vastly diverse and have multiple and contradictory consequences in the real world. It is problematic to ignore the deeply moral aspects of many of the world’s religions. I don’t point this out as an apology for religion, but rather to insist on seeing religion as a form of culture in all its complexity. Dawkins’ points about rationality and science stand even in the face of the morally positive aspects of religion.
[Dawkins has responded to many aspects of these and other criticisms in The New Statesman and in a great interview with the Infidel Guy.]

In all other aspects, I found the documentary to be a solid explanation of why scientific thinking and rational thought should prevail over religious belief, especially in the public sphere. Dawkins’ discussions with the likes of Ted Haggard illustrate clearly the problems of having rational discourse with some kinds of faithful. Haggard refuses the most basic premises of rational thinking and evidence-based argument and insists, in an odd religious postmodern twist, that all ideas are of equal value and should be given equal time. He even goes so far as to accuse Dawkins of arrogance for making scientific assertions. In another interview, on Point of Inquiry, Dawkins points out that the arrogance actually lies in making assertions for which you have no evidence whatsoever and expecting that no one will criticize your position.

As I’ve been musing lately about the merits of rationality and especially about my own work in social theory and method, I find myself frustrated by the simple fact that many people simply, willfully refuse to accept the basic mode of rational thinking. McGinn pointed out that both the academic left and the religious right have been assailing rational thought in an odd sort of allegiance for the past 30 years, where on one hand postmodern philosophy and on the other fundamentalism make similar claims that require belief without evidence and refuse the most basic rules of logic and empirical reasoning. It may simply be impossible to have that discussion where those premises are not shared. For the academic left, perhaps more empirically and rationally minded researchers can work harder to actively advocate the methods of rational inquiry; and for the religious right, perhaps the best we can do is continue unceasingly to fight for the fundamental principles of democracy that would allow them their religiosity without infringing on social progress. The first debate, on the left, is ongoing and will probably work itself out as postmodernism continues to lose its cachet outside of the humanities; but with Dawkins, I do fear the power of the fundamentalist mind whose morality is clear and justifies violence and coercion to remake society in his or her image.

Popper, Falsification, and Social Theory 7 June 2006

Posted by Todd in Philosophy & Social Theory, Philosophy of Science, Queer Theory.

Theory and Reality is blowing my mind and making me stop and think about things I've taken for granted since high school. I've always been taught that scientific theories can never be proven true, but only proven false. This is the theory of falsification that comes from Karl Popper (see Chapter 4). Popper's theory of science breaks down into two parts, both of which have had an incredible impact on the way we think about and teach scientific inquiry over the past 50 years.

1. Falsification (see above). For Popper, there can be no induction (that is, the quest to infer general principles from observation) because confirmation is an impossibility. Only falsification is possible.

2. Demarcation. Popper wants to distinguish between scientific and pseudo-scientific theories. For Popper, this comes down to a rather simple-seeming idea that scientific theories are those wherein scientists took risks by stepping outside of the known observations to try out new “conjectures.”

Only in reading this book have I come to confront the deep problems of scientific confirmation, about how we can actually know when a theory has been confirmed (because science presumes a constant degree of uncertainty, philosophers of science avoid the word “prove”). But I had always assumed the theory of falsification. Godfrey-Smith shows that most scientists use Popperian language and theories of science without understanding what he was actually arguing. Following G-S's argument, here are the major problems with Popper's two points:

1. The problem with falsification is the problem with scientific method in general. It's called the problem of “holism in testing.” Simply put, in order to falsify (or confirm) a theory, we must make assumptions about our ability to observe, quantify and analyze data. If any of those assumptions are faulty, our falsification (or confirmation) is false. Popper got around this by arguing that science was a set of behaviors that included making decisions about how and when to observe and what data counts. The problem with Popper's answer is obvious: if it boils down to “decisions,” then anyone can accept or falsify any theory based on which decisions they make about observation and data. If that is the case, there is not only no confirmation, but no falsification either. Hopefully it is obvious why this is an unacceptable theory of science. Popper then argued that you could choose to accept a theory if it was improbable that it would ever be falsified. But again we are left with the question: how improbable does falsification have to be before we are justified in accepting the theory? Instead of his original claim, that observation, tightly linked to theories, has the power to falsify those theories, Popper ends up saying that falsification can occur without observation, because observation is probabilistic and based on decisions.

The problem really boils down to Popper's rejection of confirmation (G-S calls philosophers of this ilk induction skeptics). G-S illustrates the problem with a thought experiment: suppose you have to build a bridge that must carry load X, and you have two theories of how to build it. The theory behind Bridge A has been tested for 50 years and never falsified; the theory behind Bridge B is brand new and has likewise never been falsified. Popper gives us no way to choose between the two. If Popper is correct and there is no way to confirm a theory, then two theories that have never been falsified are of equal value. Intuitively, the answer probably seems obvious to most people: you choose the design that has been tested for 50 years. But remember, for Popper, Bridges A & B are theoretically equal, because there is no such thing as confirmation. Popper worked most of his later life trying to theorize an idea of corroboration, but never succeeded in solving this problem.
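One way to see why intuition favors the long-tested bridge is a simple Bayesian sketch (my own illustration, not Popper's or Godfrey-Smith's, and exactly the kind of inductive reasoning Popper refused to allow). Treat each survived test as evidence, on the assumption that a true theory always passes while a false theory passes any given test only with some probability:

```python
def posterior_after_tests(prior, n_tests, p_pass_if_false=0.5):
    """Probability the theory is true after surviving n_tests independent
    tests, assuming a true theory always passes and a false theory passes
    each test with probability p_pass_if_false."""
    p_true = prior
    for _ in range(n_tests):
        # P(pass) = P(pass | true) * P(true) + P(pass | false) * P(false)
        p_pass = p_true * 1.0 + (1 - p_true) * p_pass_if_false
        p_true = p_true / p_pass   # Bayes' rule: P(true | theory passed)
    return p_true

bridge_a = posterior_after_tests(prior=0.5, n_tests=50)   # tested for decades
bridge_b = posterior_after_tests(prior=0.5, n_tests=0)    # brand new, untested
```

On these assumptions Bridge A's theory ends up with a posterior probability vanishingly close to 1, while Bridge B's sits at its prior of 0.5. Whether updating of this kind counts as rational induction is, of course, precisely what the induction skeptic denies.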

In the end, induction skepticism is as problematic as a naive belief in confirmation and induction proper. And so we are left with the problem of how to conceive of scientific confirmation, a question I'm sure Godfrey-Smith will address more as the book progresses.

2. So the scientific method most of us were taught is actually a kind of combination of Popper's method of demarcation (scientific theories are those which take risks) and a theory of confirmation (which Popper rejects). This boils down to the basic formula:

hypothesis/conjecture + observation/experimentation = theory/confirmation

Godfrey-Smith modifies Popper's demarcation theory to say that risk-taking is not about the production of the theory but about the way the theory is handled. G-S argues that Popper was onto something when he claimed that a scientific theory must be one that is set up for falsification, that it can risk observation. Popper argued that many theories can appear to have lots of possibility of observational falsification, but in actuality don't because they are never exposed to the risk of falsification. Specifically, Popper uses Marxism and Freudianism as examples of unscientific theories that are/were not produced in such a way as to be subject to falsification. G-S argues instead that the demarcation between scientific and non-scientific theory arises in the way those theories are handled. He gives examples of how Freudian theory can be handled as an a priori philosophy not subject to observation and falsification (e.g., in literary theory) or how it can be handled scientifically and open to observation and falsification (e.g., in cognitive psychology).

As I've been working on revisions of my book manuscript, I've been grappling with a comment one of the peer-reviewers made, that I needed to “use” more queer theory. I was educated early in my graduate career in queer theory and read it avidly until about 6 years ago, when I began to lose interest because it didn't seem to correspond to the people I was researching. What I've come away with from reading G-S this morning is a question about how to treat or handle queer theory scientifically and how to frame it for my book. If G-S's critique of Popper and Popper's critique of social scientific theories are correct, then the fundamental question I should be asking is this:

What observations would have to be made to falsify queer theory? Or given the observations already made by scientists and by me in my social-historical research, what aspects of queer theory have already been falsified?

I think in all honesty my research falsifies many of queer theory's basic premises about the social role of homosexuality, and I think the biology of homosexuality as it has developed over the past 15 years basically wipes out most strands of queer theory. And yet I'm in a discipline that takes it seriously, so I have to address it somehow (much like literary theory still takes Freud seriously).

Social Theory as Explanatory Inference 1 June 2006

Posted by Todd in Philosophy & Social Theory, Philosophy of Science, Queer Theory.
comments closed

In “hard” sciences, hypotheses precede experimentation and observation, and theories emerge at the end, when a hypothesis has been confirmed. Although even scientists often say theory when they mean hypothesis (day-to-day American English uses theory in that way), the more technical way to understand theory is as an idea that has been confirmed through data. In the Western philosophical tradition, there has been an ongoing search for “universal principles”, called, for example, Eternal Law (Aquinas) or Natural Law (Enlightenment philosophers). Much of science for the past 400 years has been aimed at finding these universal “laws” that define and explain the natural world. This inclination to seek out and infer general principles has become problematic in the social sciences, where theory has become in many ways a hindrance to effective and useful research.

In the social sciences, the idea of theory has created three important problems. First, following both from 18th-century philosophy of science and from German rationalism, early social theorists assumed that they were finding and describing universal laws of human social interaction and human culture. Today, even after the modern, postmodern, and anti-foundationalist critiques of universality in human social and cultural life, social theorists have continued, often unthinkingly, to make universal claims. (Ironically, postmodern theorists and their adherents are often as guilty of this as anyone.) Second, social theory and cultural theory (especially in cultural studies) have become ends in themselves. Both professionally and intellectually, reading and understanding theorists and then employing them in one's research has become a de facto marker of intelligence (often referred to as being “smart” or “getting it”) and sophistication. And so third, theory has come to function, in effect, as hypothesis: as axiomatic assumptions that precede research. Much of sociological, anthropological, and, most egregiously, cultural studies research reads as a proof of a preceding theory.

This is a problem of method derived from a faulty understanding of what theory actually is. In “Theory and Reality: An Introduction to the Philosophy of Science”, Peter Godfrey-Smith lays out three different levels of inductive reasoning in the production of scientific theories: 1) induction proper, which infers general principles from available data; 2) explanatory inference, which infers satisfactory explanations of available data, refusing the temptation to make generalizations; and 3) projection, which infers future outcomes from present knowledge. Godfrey-Smith argues that even in the hard sciences, pure induction most often cannot actually be confirmed, and that most scientific thinking should instead lead to explanatory inference.

John Dewey argued that scientific method was merely a formalized and institutionalized version of the natural way that human beings think. That is, based on prior experience, humans project into the future expected outcomes (hypotheses) and act accordingly (experiment); depending on the consequences of their actions (measuring outcomes), the original assumption (hypothesis) is either confirmed or contradicted, producing new concepts based in experience (theory).(1) When the assumption is contradicted, humans act in one of three primary ways: they act to change their environment so that the expected outcomes actually occur; they act to change their minds such that their assumptions are different; or they deny their experience and continue as before, regardless of outcomes.
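Dewey's cycle (project an expectation, act, measure the outcome, revise) can be sketched as a small loop. This is purely my own illustration, not anything from Dewey; the names `inquire`, `belief`, `observe`, and `revise` are labels I've invented for the steps described above:

```python
def inquire(belief, observe, revise, max_rounds=10):
    """A toy rendering of Dewey's cycle of inquiry: act on a belief,
    measure the outcome, and revise the belief when experience
    contradicts it. Returns the (always provisional) settled belief."""
    for _ in range(max_rounds):
        outcome = observe(belief)   # act on the belief; measure consequences
        if outcome == belief:       # experience confirms the expectation
            return belief
        belief = revise(belief, outcome)  # contradiction: adjust the belief
    return belief                   # never final, only settled for now

# Toy usage: a belief about some measurable quantity, revised
# toward what observation actually returns.
actual = 7
settled = inquire(
    belief=0,
    observe=lambda b: actual,   # experience reports the world as it is
    revise=lambda b, o: o,      # change one's mind to fit experience
)
# settled == 7
```

Note that the sketch only models the second of Dewey's three responses to contradiction (changing one's mind); changing the environment, or denying the experience and carrying on regardless, fall outside the loop.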

For Dewey, part of formalizing this natural cognitive process into the scientific method is a particular understanding of theory, where theories are always instrumental–that is, theories are always a conceptual (cognitive) framework to understand lived experience, or the world as experienced by the theorist.

To overcome the methodological problems derived from the way theory is “used” in the social sciences and cultural studies, Dewey's and Godfrey-Smith's analyses of theory can point to a meta-theory of social and cultural research. Given the incredible contingency and multiplicity of variables in cultural and social interactions, social theory must in nearly all cases not be taken as a general principle. Researchers will have been educated in the social theories of others, and these can serve as frameworks and jumping-off points for research, but preceding theories should be exploded and rendered transparent in approaching research, then laid aside to interpret the data at hand; in most cases they should not themselves serve as hypotheses to be proved. Theory should never determine beforehand what evidence is sought or considered, nor what conclusions are drawn. Rather, theory should emerge after the research, through explanatory inference, and it must correspond to the data gathered.

(1) Dewey uses the word theory interchangeably with hypothesis, because he sees theories as being in a kind of cycle, where a beginning theory is constantly tested through experience and revised based on outcomes. This instrumental understanding of theory insists that theories are always in process and never finished, because they are always being adjusted based on experience.


On Knowing (More on Religion vs. Science) 25 March 2006

Posted by Todd in American Pragmatism, Cognitive Science, Philosophy of Science, Religion, Science.
comments closed

Among my most ardent interests is the study of how human beings know. I had never thought too much about the question until I began studying John Dewey in 1998. For Dewey, human knowledge is necessarily embodied and experiential. He called it the organism-environment model, where the embodied individual knows only in transaction with its environment, and where for humans, environment is broadly construed to include the social (other humans) and cultural (symbolic, meaningful, and linguistic) elements of experience.

The traditional philosophical epistemology was based in what Dewey called the "Spectator Model" of knowledge, where philosophers (think: Plato) knew stuff from the "outside" as disembodied spectators. Dewey is making a key distinction between objective reality (which exists independent of human experience) and our knowledge of reality (which is necessarily experiential). In the spectator model, reality is knowable unmediated by our bodies and experiences, thereby lending authority to the claims of philosophers who "know" it, and setting up a foil against which unreal, false, and situated knowledge could be compared. To the contrary, Dewey (and William James, and Charles Peirce, and George Herbert Mead) argued that whether you are a philosopher, a scientist, an engineer, a farmer, a hunter-gatherer, or a housewife, your knowledge comes from the same place: a transaction with your environment—it is always mediated through experience.[1]

James called this "pure experience," where the "vital flux of life" becomes the very raw stuff with which and about which we think; and the process of thinking about it produces any number of "objects" that we create to make sense of and to manipulate and change our environment.

Recently, my good friend Matt, in responding to my "Evolution of God" post of last weekend, raised important issues in the experience of knowing:

In other words, the things that we know (in this case mathematics) do not arise out of an embodied experience but exist independent of the human mind. 1+1=2 even if humans don't exist and even if it can't be proven without some fundamental intuitions about the nature of 1.

For me, there are actually two separate issues here. First, is there an objective reality outside of human experience? And second, how do human beings know that reality? The first issue is a variation on the oldie but goodie, "If a tree falls in the forest and no one is around to hear it, does it make a noise?" Is there an objective reality of the tree falling and of the air molecules compressing and spreading outward from the event in such a way as to create noise, or does noise only exist because we hear it? It is vital to understand that there are objective realities out there which are not directly experienced by any human observer, but which nonetheless exist and are real. Ironically, our own experience tells us this is so. We can come upon the fallen log 100 years later and observe the consequences of its fall. The tree fell independent of the human mind.

The second issue, however, is for me the interesting question. It isn't whether or not the tree actually made noise, but how we would know one way or the other. Or to use Matt's example, the interesting question isn't whether or not 1+1 exists outside of human experience, but rather, how human beings come to know that 1+1=2 and that it exists independent of our experience.

It is the process of knowing that is at issue. What is key here in the science and religion "controversy" is that human knowledge comes from the same place, regardless of whether you're trying to figure out how to build a $1.3 billion bridge from Oakland to Yerba Buena Island, or trying to figure out how to teach your kids to look both ways before crossing the street, or trying to figure out how mollusks breed, or trying to understand the emptiness you feel when you're home alone. From philosophy to theology to biology to parenting to engineering to athletics to auto mechanics, human knowing is necessarily embodied and experiential.

I've been reading an article on the development of American Pragmatism and its transformation into Neo-Pragmatism over the past couple of decades, and I stumbled upon this explanation of Charles Peirce's definition of belief, which I think gives a nice summary of my points above:

[Peirce] construes belief as involving habits of action, and doubt as the unsettled state resulting from the interruption of a belief-habit by recalcitrance on the part of experience. This real, living doubt, unlike Cartesian paper doubt, is both involuntary and disagreeable. The primitive basis of the most sophisticated human cognitive activity is a homeostatic process in which the organism strives to return to equilibrium, a process beginning with doubt and ending when a new habit, a revised belief, is reached.

Peirce compares four methods for the “fixation of belief.” The method of tenacity is simply to cling obstinately to whatever beliefs you have, avoiding any evidence that might unsettle them. The method of authority is to have a church or state impose conformity in belief. The a priori method — the method traditionally favored by metaphysicians — seeks to settle belief by inquiring what is “agreeable to reason.” But the most sophisticated inquirers, aspiring to indefeasibly settled belief, will always be motivated to further inquiry, never fully satisfied with what they presently incline to think. So the culmination of that primitive homeostatic process is the scientific method — the only method, Peirce argues, that would eventually result in beliefs which are indefeasibly stable, permanently safe from recalcitrance.

If you really want to learn the truth, you must acknowledge that you don’t satisfactorily know already; so the scientific inquirer is a “contrite fallibilist” (CP 1.14) ready to “drop the whole cartload of his beliefs, the moment experience is against them” (CP 1.55). In the Critical Common-sensism of Peirce’s mature philosophy — an attempted synthesis, as the phrase suggests, of Kant’s and Reid’s responses to Hume — the scientific inquirer is seen as submitting the instinctive beliefs of common sense to criticism, refinement, and revision.[2]

What the pragmatists are arguing isn't that we give up theological or philosophical debates, but rather that we move past the old and untrue epistemologies and ontologies that make us think we are pronouncing ultimate truth. In our philosophical and theological and literary and artistic debates, and more importantly, in our values and moral debates, we must acknowledge where our knowledge comes from and approach our meaningful questions—who am I, why am I here, what is the meaning of life—beginning with a "scientific mindset" (as Dewey called it) or a "scientific attitude" (as Peirce called it). This doesn't mean that we cede all knowledge production to scientists; rather, it means that we approach our quests for knowledge with an understanding of the limits of our knowledge and how we get it; that when we make existential or moral or theological claims, they be made by giving reasons and arguments and evidence; that they be explicitly anchored in our experience in this world rather than in our efforts to create certainty and stability by generating knowledge that doesn't correspond to our experiences.

The human capacity to imagine and to think, to begin with experience, create thought-objects, and then imagine all their possibilities—our ability to see thought-objects as infinite possible means—allows us to imagine ourselves into cultural structures that are maladaptive, when we rely on tenacity, on authority, or on a priori knowledge without accounting for our experience. At best, such disconnected knowledge-systems (cultures) can be merely odd; at worst they can be immoral and even violent. In the pluralistic world we live in today, nothing could be more dangerous.

[1] This is a similar mistake to that made by much of postmodernism and poststructural theory. Where both pragmatic and postmodern theories of knowledge are anti-foundationalist (i.e., deconstructed)—that is, there is no correspondence between signified and signifier—the postmodernists end up making the same mistakes as Plato and most Western philosophers. They deal with knowledge as if it exists outside of lived experience. Derrida and the social theorists who rely on him, such as Judith Butler, all explain the change of knowledge as originating in the non-correspondence of the signifier. And in sociology, the cultural sociologists make a like mistake, assuming that 'culture' exists independent of human bodies and experiences. The pragmatists, while seeing truth as a process and as necessarily experiential, also insist on the body. The linguistic turn of the postmodernists ultimately fails to describe the actual process of knowledge production in human individuals and groups, because it treats language (discourse) as prior to and structuring experience. This gives far too much power to language and to already-held knowledge at the moment of experience: the postmodern position ignores the embodied process of knowledge production in the first instance.
[2] Susan Haack, "Pragmatism, Old and New" in Contemporary Pragmatism Vol. 1, No. 1 (June 2004), 3-41.