O Say, What Is Truth? 28 March 2007. Posted by Todd in American Pragmatism, Cognitive Science, Evolution, Philosophy of Science, Postmodernity and Postmodernism.
[Posted this on FLAK earlier today, and thought I’d cross-post it here.]
I find the American Pragmatists’ definition of truth to be the most helpful (especially Charles Peirce, William James, and John Dewey). They were able to combine the idea that there are objective facts independent of human perception (i.e., that truth isn’t located in perception) with the idea that human perceptions of those facts change over time (i.e., that human knowledge arises from changing experiences in their environments). They argued that, in terms of human knowledge, truth is a process and is functional. This isn’t a kind of postmodern relativism (although they were relativistic in a narrow sense), but rather the admission that human knowledge is always incomplete. First, truth is a process because it arises in human experience within an environment, rather than existing as a Thing-in-itself. Second, it is functional, because human beings know truth based on whether it “works” in their environmental experience.
Peirce, James, and Dewey argued that science is simply a formalization and refinement of the natural way that human brains gather knowledge from and about their environments: through experiencing them and thinking about those experiences. Science merely takes that natural, biological process and makes it rigorous. But science also only works because it has built into it the notion that new experiences may bring new knowledge tomorrow.
Dewey took this a step further, arguing that human history has been a Quest for Certainty: humans seek to understand their environments perfectly in order to control them (a theme since picked up by cognitive science and, most interestingly, by evolutionary psychologists), and philosophy and science have been oriented toward achieving that Certainty. Dewey argued that, since we now understand how human brains work (he was drawing this conclusion in the 1930s, when scientific psychology was still relatively new), since we know that environments constantly change (which he took from Darwinism), and since our brains thereby constantly adapt to those changes, our formal searches for truth (i.e., scientific and philosophical) must jettison the Quest for Certainty and embrace the fact that knowledge is always already Uncertain.
What is Known at any given time by any given group or individual is Known precisely because it Works in the environment at hand (i.e., truth as function). But that Known will constantly change as the organism (human individual or group) moves through time and its experience changes (i.e., truth as process). The objective truth that exists independent of human perception is also knowable, if only Uncertainly and partially, through the processes and methods of science (small-s), which are to be open to experience, to hold all ideas in solution and replace them when new information demands it, and to actively seek understanding without ever believing you have achieved Ultimate Truth. Truth is dead the instant you think you have it and that there is nothing more to be said; truth only works, or rather it only works Correctly, when it is understood as a Process.
Knowledge: Faith & Reason 21 December 2006. Posted by Todd in Academia & Education, Democratic Theory, Ethics, Law/Courts, Philosophy of Science, Religion, Science.
My alma mater (soul mother?), the University of Kansas, hosted a series of lectures this fall at its Hall Center for the Humanities as part of its Difficult Dialogues series, including an amazing range of people talking about the role of religion and science in public life. The talks are in RealMedia format and so require RealOne Player. They are worth the time. I’m about halfway through Judge Jones’ talk (the judge who presided over last year’s Dover intelligent design case) and will comment later. Here’s a link to the Hall Center’s web site, and the links to all the talks.
How to Spot Bad Science a Mile Away 22 August 2006. Posted by Todd in Commentary, Philosophy of Science, Science, Teaching.
The Chronicle of Higher Education published a brief article in 2003 describing a rubric for determining the legitimacy of scientific claims made in public. As I read through the seven points by Dr. Park (a physics professor), my students kept coming to mind, as they resist everything I give them. Admittedly, social science has a different dynamic: because of its very nature, everyone thinks they are automatically experts (you know, like, they live in society already). This semester I’ll be teaching quite a bit of evolution and cognitive science in a course on how different cultures developed their particular views of and relationships to the ecosystems in which they live (a.k.a. nature; a.k.a. the environment), so I am braced for irritating evolution conversations and am considering using this list to discuss the issues of science in culture tomorrow on the first day of class.
1. The discoverer pitches the claim directly to the media. The integrity of science rests on the willingness of scientists to expose new ideas and findings to the scrutiny of other scientists. Thus, scientists expect their colleagues to reveal new findings to them initially. An attempt to bypass peer review by taking a new result directly to the media, and thence to the public, suggests that the work is unlikely to stand up to close examination by other scientists.
One notorious example is the claim made in 1989 by two chemists from the University of Utah, B. Stanley Pons and Martin Fleischmann, that they had discovered cold fusion — a way to produce nuclear fusion without expensive equipment. Scientists did not learn of the claim until they read reports of a news conference. Moreover, the announcement dealt largely with the economic potential of the discovery and was devoid of the sort of details that might have enabled other scientists to judge the strength of the claim or to repeat the experiment. (Ian Wilmut’s announcement that he had successfully cloned a sheep was just as public as Pons and Fleischmann’s claim, but in the case of cloning, abundant scientific details allowed scientists to judge the work’s validity.)
Some scientific claims avoid even the scrutiny of reporters by appearing in paid commercial advertisements. A health-food company marketed a dietary supplement called Vitamin O in full-page newspaper ads. Vitamin O turned out to be ordinary saltwater.
2. The discoverer says that a powerful establishment is trying to suppress his or her work. The idea is that the establishment will presumably stop at nothing to suppress discoveries that might shift the balance of wealth and power in society. Often, the discoverer describes mainstream science as part of a larger conspiracy that includes industry and government. Claims that the oil companies are frustrating the invention of an automobile that runs on water, for instance, are a sure sign that the idea of such a car is baloney. In the case of cold fusion, Pons and Fleischmann blamed their cold reception on physicists who were protecting their own research in hot fusion.
3. The scientific effect involved is always at the very limit of detection. Alas, there is never a clear photograph of a flying saucer, or the Loch Ness monster. All scientific measurements must contend with some level of background noise or statistical fluctuation. But if the signal-to-noise ratio cannot be improved, even in principle, the effect is probably not real and the work is not science.
Thousands of published papers in parapsychology, for example, claim to report verified instances of telepathy, psychokinesis, or precognition. But those effects show up only in tortured analyses of statistics. The researchers can find no way to boost the signal, which suggests that it isn’t really there.
4. Evidence for a discovery is anecdotal. If modern science has learned anything in the past century, it is to distrust anecdotal evidence. Because anecdotes have a very strong emotional impact, they serve to keep superstitious beliefs alive in an age of science. The most important discovery of modern medicine is not vaccines or antibiotics, it is the randomized double-blind test, by means of which we know what works and what doesn’t. Contrary to the saying, “data” is not the plural of “anecdote.”
5. The discoverer says a belief is credible because it has endured for centuries. There is a persistent myth that hundreds or even thousands of years ago, long before anyone knew that blood circulates throughout the body, or that germs cause disease, our ancestors possessed miraculous remedies that modern science cannot understand. Much of what is termed “alternative medicine” is part of that myth.
Ancient folk wisdom, rediscovered or repackaged, is unlikely to match the output of modern scientific laboratories.
6. The discoverer has worked in isolation. The image of a lone genius who struggles in secrecy in an attic laboratory and ends up making a revolutionary breakthrough is a staple of Hollywood’s science-fiction films, but it is hard to find examples in real life. Scientific breakthroughs nowadays are almost always syntheses of the work of many scientists.
7. The discoverer must propose new laws of nature to explain an observation. A new law of nature, invoked to explain some extraordinary result, must not conflict with what is already known. If we must change existing laws of nature or propose new laws to account for an observation, it is almost certainly wrong.
My favorites are Nos. 1, 2 and 5.
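A quick illustration of warning sign no. 3, since it is the most statistical of the seven: the sketch below is mine, not from the article above. It simulates a guessing task with no real effect and shows how a batch of small, noisy studies will throw up apparently positive results, while the apparent signal refuses to strengthen as the sample grows. The 1-in-4 guessing task, the 40-trial study size, and the 40% threshold are all hypothetical choices made only for illustration.

```python
# Warning sign no. 3, simulated: pure noise plus many small studies yields
# occasional apparent "effects," but the signal cannot be boosted with more data.
import random
import statistics

random.seed(42)

def fake_experiment(n_trials: int) -> float:
    """Simulate a 'telepathy' test with no real effect: each guess has a
    1-in-4 chance of matching the target by luck alone. Returns the hit rate."""
    hits = sum(1 for _ in range(n_trials) if random.random() < 0.25)
    return hits / n_trials

# Run many small null studies: a few will look "positive" by chance alone.
small_studies = [fake_experiment(40) for _ in range(1000)]
lucky = [rate for rate in small_studies if rate >= 0.40]  # well above the 25% baseline
print(f"{len(lucky)} of 1000 null studies hit >= 40%: apparent 'effects' from noise alone")

# But the effect does not survive a larger sample: the hit rate settles back
# toward the 25% chance baseline instead of strengthening.
print(f"hit rate with 40 trials:      {fake_experiment(40):.2f}")
print(f"hit rate with 100,000 trials: {fake_experiment(100_000):.3f}")
print(f"mean hit rate across the 1000 small studies: {statistics.mean(small_studies):.3f}")
```

Pure chance produces the occasional lucky batch; a real effect would survive, and strengthen with, more data.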
[My review of Part 2—The Virus of Faith can be found here.]
There has been much ado about Richard Dawkins’ Channel 4 two-part documentary, The Root of All Evil?, mainly because of Dawkins’ almost strident atheism and because of the relatively inflammatory title. [The video is not yet available in North America, but both parts are currently downloadable from Google Video, Part 1 here and Part 2 here.] Having been raised in a pretty orthodox Mormon household and having family on both sides who are quite religious now, I tend to be less afraid of religiosity in general than Dawkins seems to be. And I do sympathize with the religious impulse: the desire to believe in something greater, and for an explanation of both the uncertainty and fickleness of life and the disappointment with the realities of our existence.
When I realized I no longer believed in God, I found myself with twin wounds, one left by the loss of community, the other by the loss of submission to something greater. Dawkins seems to miss these dynamics completely, the importance of communal bonds and identity formation in people’s desire for and attachment to their religious beliefs. On Bill Moyers’ new series, On Faith & Reason, Colin McGinn said that when he left faith behind he found the world without God to be so much more vibrant and rich than it ever was with God. Although I did also eventually arrive at that conclusion, the years it took me to separate myself from religion were painful and transformed my most basic world view. The difficulty of replacing one’s world view and of accepting the full implications of rationality and science can be quite overwhelming, but the documentary presents Reason as an easy enlightenment to which folks should readily convert.
So the main problem I had with the documentary emerges from my personal experience combined with my training as a sociologist: Dawkins doesn’t seem to fully understand how and why religion has the power over people that it does, the role it actually plays in giving people’s lives meaning. All he seems to be able to see is its irrationality and anti-scientific mindset, along with the horrifying moral consequences of such belief. I had no qualms or disagreement with Dawkins on these points, but the documentary seemed to set up religion and science as two simple categories without addressing the complexities of why people believe in the first place and why it can be so hard for an individual, emotionally, socially, and psychologically, to leave a faith-community. An exploration of these dynamics can help us understand more deeply why people refuse the evidence of science and rational argument; more importantly, it could help us have more productive dialogues with the faithful, something of utmost importance if we are going to save democracies around the world from collapsing into theocracies.
Another quibble I had was that the documentary painted religion with such a broad brush that suicide bombers and rabid fundamentalists are lumped in with the millions of religious people who fight injustice, hunger, and violence worldwide. Human religions are vastly diverse and have multiple and contradictory consequences in the real world. It is problematic to ignore these deeply moral aspects of many of the world’s religions. I don’t point this out as an apology for religion, but rather to insist on seeing religion as a form of culture in all its complexity. Dawkins’ points about rationality and science stand even in the face of the morally positive aspects of religion.
[Dawkins has responded to many aspects of these and other criticisms in The New Statesman and in a great interview with the Infidel Guy.]
In all other respects, I found the documentary to be a solid explanation of why scientific thinking and rational thought should prevail over religious belief, especially in the public sphere. Dawkins’ discussions with the likes of Ted Haggard illustrate clearly the problems of having rational discourse with some kinds of faithful. Haggard refuses the most basic premises of rational thinking and evidence-based argument and insists, in an odd religious-postmodern twist, that all ideas are of equal value and should be given equal time. He even goes so far as to accuse Dawkins of arrogance for making scientific assertions. In another interview, on Point of Inquiry, Dawkins points out that the real arrogance lies in making assertions for which you have no evidence whatsoever and expecting that no one will criticize your position.
As I’ve been musing lately about the merits of rationality, and especially about my own work in social theory and method, I find myself frustrated by the simple fact that many people simply, willfully refuse to accept the basic mode of rational thinking. McGinn pointed out that both the academic left and the religious right have been assailing rational thought in an odd sort of alliance for the past 30 years, where on one hand postmodern philosophy and on the other fundamentalism make similar claims that require belief without evidence and refuse the most basic rules of logic and empirical reasoning. It may simply be that it is impossible to have that discussion where those premises are not shared. For the academic left, perhaps more empirically and rationally minded researchers can work harder to actively engage in advocating the methods of rational inquiry; and for the religious right, perhaps the best we can do is continue unceasingly to fight for the fundamental principles of democracy that would allow them their religiosity without infringing on social progress. The debate on the left is ongoing and will probably work itself out as postmodernism continues to lose its cachet outside of the humanities; but with Dawkins, I do fear the power of the fundamentalist mind whose morality is clear and justifies violence and coercion to remake society in his or her image.
On Knowing (More on Religion vs. Science) 25 March 2006. Posted by Todd in American Pragmatism, Cognitive Science, Philosophy of Science, Religion, Science.
Among my most ardent interests is the study of how human beings know. I had never thought too much about the question until I began studying John Dewey in 1998. For Dewey, human knowledge is necessarily embodied and experiential. He called it the organism-environment model, where the embodied individual knows only in transaction with its environment, and where for humans, environment is broadly construed to include the social (other humans) and cultural (symbolic, meaningful, and linguistic) elements of experience.
The traditional philosophical epistemology was based in what Dewey called the "Spectator Model" of knowledge, where philosophers (think: Plato) knew stuff from the "outside" as disembodied spectators. Dewey is making a key distinction between objective reality (which exists independent of human experience) and our knowledge of reality (which is necessarily experiential). In the spectator model, reality is knowable unmediated by our bodies and experiences, thereby lending authority to the claims of philosophers who "know" it, and setting up a foil against which unreal, false, and situated knowledge could be compared. To the contrary, Dewey (and William James, and Charles Peirce, and George Herbert Mead) argued that whether you are a philosopher, a scientist, an engineer, a farmer, a hunter-gatherer, or a housewife, your knowledge comes from the same place: a transaction with your environment. It is always mediated through experience.
James called this "pure experience," where the "vital flux of life" becomes the very raw stuff with which and about which we think; and the process of thinking about it produces any number of "objects" that we create to make sense of and to manipulate and change our environment.
Recently, my good friend Matt, in responding to my "Evolution of God" post of last weekend, raised important issues about the experience of knowing:
In other words, the things that we know (in this case mathematics) do not arise out of an embodied experience but exist independent of the human mind. 1+1=2 even if humans don't exist and even if it can't be proven without some fundamental intuitions about the nature of 1.
For me, there are actually two separate issues here. First, is there an objective reality outside of human experience? And second, how do human beings know that reality? The first issue is a variation on the oldie but goodie, "If a tree falls in the forest and no one is around to hear it, does it make a noise?" Is there an objective reality of the tree falling and of the air molecules compressing and spreading outward from the event in such a way as to create noise, or does noise only exist because we hear it? It is vital to understand that there are objective realities out there which are not directly experienced by any human observer, but which nonetheless exist and are real. Ironically, our own experience tells us this is so. We can come upon the fallen log 100 years later and observe the consequences of its fall. The tree fell independent of any human mind.
The second issue, however, is for me the interesting question. It isn't whether or not the tree actually made noise, but how we would know one way or the other. Or to use Matt's example, the interesting question isn't whether or not 1+1 exists outside of human experience, but rather, how human beings come to know that 1+1=2 and that it exists independent of our experience.
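As a side note on the formal half of Matt's example: even the derivation of 1+1=2 bottoms out in prior definitions of zero, successor, and addition, which is part of why the interesting question is how we come to know it. Here is a tiny sketch in Lean (my own illustration, not part of the original exchange), using a toy Peano-style construction rather than Lean's built-in numbers:

```lean
-- A toy Peano-style construction: "1 + 1 = 2" follows by computation,
-- but only once we have settled what zero, successor, and addition mean.
inductive N where
  | zero : N
  | succ : N → N

def add : N → N → N
  | n, N.zero   => n
  | n, N.succ m => N.succ (add n m)

def one : N := N.succ N.zero
def two : N := N.succ one

-- The proof is just unfolding the definitions.
example : add one one = two := rfl
```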
It is the process of knowing that is at issue. What is key here in the science and religion "controversy" is that human knowledge comes from the same place, regardless of whether you're trying to figure out how to build a $1.3 billion bridge from Oakland to Yerba Buena Island, or trying to figure out how to teach your kids to look both ways before crossing the street, or trying to figure out how mollusks breed, or trying to understand the emptiness you feel when you're home alone. From philosophy to theology to biology to parenting to engineering to athletics to auto mechanics, human knowing is necessarily embodied and experiential.
I've been reading an article on the development of American Pragmatism and its transformation into Neo-Pragmatism over the past couple of decades, and I stumbled upon this explanation of Charles Peirce's definition of belief, which I think gives a nice summary of my points above:
[Peirce] construes belief as involving habits of action, and doubt as the unsettled state resulting from the interruption of a belief-habit by recalcitrance on the part of experience. This real, living doubt, unlike Cartesian paper doubt, is both involuntary and disagreeable. The primitive basis of the most sophisticated human cognitive activity is a homeostatic process in which the organism strives to return to equilibrium, a process beginning with doubt and ending when a new habit, a revised belief, is reached.
Peirce compares four methods for the “fixation of belief.” The method of tenacity is simply to cling obstinately to whatever beliefs you have, avoiding any evidence that might unsettle them. The method of authority is to have a church or state impose conformity in belief. The a priori method — the method traditionally favored by metaphysicians — seeks to settle belief by inquiring what is “agreeable to reason.” But the most sophisticated inquirers, aspiring to indefeasibly settled belief, will always be motivated to further inquiry, never fully satisfied with what they presently incline to think. So the culmination of that primitive homeostatic process is the scientific method — the only method, Peirce argues, that would eventually result in beliefs which are indefeasibly stable, permanently safe from recalcitrance.
If you really want to learn the truth, you must acknowledge that you don’t satisfactorily know already; so the scientific inquirer is a “contrite fallibilist” (CP 1.14) ready to “drop the whole cartload of his beliefs, the moment experience is against them” (CP 1.55). In the Critical Common-sensism of Peirce’s mature philosophy — an attempted synthesis, as the phrase suggests, of Kant’s and Reid’s responses to Hume — the scientific inquirer is seen as submitting the instinctive beliefs of common sense to criticism, refinement, and revision.
What the pragmatists are arguing isn't that we give up theological or philosophical debates, but rather that we move past the old and untrue epistemologies and ontologies that make us think we are pronouncing ultimate truth. In our philosophical and theological and literary and artistic debates, and more importantly, in our values and moral debates, we must acknowledge where our knowledge comes from and approach our meaningful questions—who am I, why am I here, what is the meaning of life—beginning with a "scientific mindset" (as Dewey called it) or a "scientific attitude" (as Peirce called it). This doesn't mean that we cede all knowledge production to scientists; rather, it means that we approach our quests for knowledge with an understanding of the limits of our knowledge and of how we get it; that when we make existential or moral or theological claims, they be made by giving reasons and arguments and evidence; and that they be explicitly anchored in our experience in this world rather than in our efforts to create certainty and stability by generating knowledge that doesn't correspond to our experiences.
The human capacity to imagine and to think, to begin with experience, create thought-objects, and then imagine all their possibilities—our ability to see thought-objects as infinite possible means—allows us to imagine ourselves into cultural structures that are maladaptive when we rely on our tenacity, on authority, or on a priori knowledge without accounting for our experience. At best, such disconnected knowledge-systems (cultures) can be merely odd; at worst they can be immoral and even violent. In the pluralistic world we live in today, nothing could be more dangerous.
This is a similar mistake to that made by much of postmodernism and poststructural theory. While both pragmatic and postmodern theories of knowledge are anti-foundationalist (i.e., deconstructed)—that is, there is no correspondence between signified and signifier—the postmodernists end up making the same mistakes as Plato and most Western philosophers. They deal with knowledge as if it exists outside of lived experience. Derrida and the social theorists who rely on him, such as Judith Butler, all explain the change of knowledge as originating in the non-correspondence of the signifier. And in sociology, the cultural sociologists make a similar mistake, assuming that 'culture' exists independent of human bodies and experiences. The pragmatists, while seeing truth as a process and as necessarily experiential, also insist on the body. The linguistic turn of the postmodernists ultimately fails to describe the actual process of knowledge production in human individuals and groups, because it treats language (discourse) as prior to and structuring experience. This gives far too much power to language and to already-held knowledge at the moment of experience: the postmodern position ignores the embodied process of knowledge production in the first instance.
Susan Haack, "Pragmatism, Old and New," Contemporary Pragmatism 1, no. 1 (June 2004): 3-41.