Here's a bit from a paper forthcoming in Faith and Philosophy. I put it up because it concerns a certain move that's often made re evidence of miracles - that whether it's sensible to accept testimony of the miraculous depends on one's "presuppositions" or "prior commitments". This phrase just cropped up in a slightly bad-tempered interchange I am currently having with Glenn Peoples here.
The Ted and Sarah case
Suppose I have two close friends, Ted and Sarah, whom I know to be generally sane and trustworthy individuals. Suppose that Ted and Sarah now tell me that someone called Bert paid them an unexpected visit in their home last night, and stayed a couple of hours drinking tea with them. They recount various details, such as topics of conversation, what Bert was wearing, and so on. Other things being equal, it is fairly reasonable for me to believe, solely on the basis of their testimony, that such a visit occurred.
But now suppose Ted and Sarah also tell me that shortly before leaving, Bert flew around their sitting room by flapping his arms, died, came back to life again, and finished by temporarily transforming their sofa into a donkey. Ted and Sarah appear to say these things in all sincerity. In fact, they seem genuinely disturbed by what they believe they witnessed. They continue to make these claims about Bert even after several weeks of cross-examination by me.
Am I justified in believing that Ted and Sarah witnessed miracles? Surely not. The fact that Ted and Sarah claim these things happened is not nearly good enough evidence. Their testimony presents me with some evidence that miracles were performed in their living room; but, given the extraordinary nature of their claims, I am not yet justified in believing them.
Notice, incidentally, that even if I am unable to construct a plausible explanation for why these otherwise highly trustworthy individuals would make such extraordinary claims – it’s implausible, for example, that Ted and Sarah are deliberate hoaxers (for this does not fit at all with what I otherwise know about them), or are the unwitting victims of an elaborate hoax (why would someone go to such extraordinary lengths to pull this trick?) – that would still not lend their testimony much additional credibility. Ceteris paribus, when dealing with such extraordinary reports – whether they be about alien abductions or supernatural visitations – the fact that it remains blankly mysterious why such reports would be made if they were not true does not provide us with very much additional reason to suppose that they are true.
Consideration of the Ted and Sarah case suggests something like the following moral:
P1 Where a claim’s justification derives solely from evidence, extraordinary claims (e.g. concerning supernatural miracles) require extraordinary evidence. In the absence of extraordinary evidence there is good reason to be sceptical about those claims.
The phrase “extraordinary claims require extraordinary evidence” is associated particularly with the scientist Carl Sagan. By “extraordinary evidence” Sagan means, of course, extraordinarily good evidence – evidence much stronger than that required to justify rather more mundane claims. The phrase “extraordinary claims” is admittedly somewhat vague. A claim need not involve a supernatural element to qualify as “extraordinary” in the sense intended here (the claims that I built a time machine over the weekend, or was abducted by aliens, involve no supernatural element, but would also count as “extraordinary”). It suffices, for our purposes, to say that whatever “extraordinary” means here, the claim that a supernatural miracle has occurred qualifies.
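For readers who like things formal, Sagan’s principle has a familiar Bayesian gloss (the gloss is mine, offered only as one standard way of making the principle precise; nothing in the argument depends on it). Let $M$ be the claim that a miracle occurred and $T$ the testimony that it did. Then
$$P(M \mid T) = \frac{P(T \mid M)\,P(M)}{P(T \mid M)\,P(M) + P(T \mid \neg M)\,P(\neg M)}.$$
When the prior $P(M)$ is very small, the posterior $P(M \mid T)$ remains small unless the likelihood ratio $P(T \mid M)/P(T \mid \neg M)$ is very large – that is, unless the testimony would be wildly improbable were the claim false. “Extraordinary evidence”, on this gloss, is simply evidence carrying an extraordinarily high likelihood ratio.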
Some theists (though of course by no means all) have challenged the application of Sagan’s principle to religious miracles, maintaining that which claims qualify as “extraordinary” depends on our presuppositions. Suppose we begin to examine the historical evidence having presupposed that there is no God, or that there is unlikely to be one. Then of course Jesus’ miracles will strike us as highly unlikely events requiring exceptionally good evidence before we might reasonably suppose them to have occurred. But what if we approach the Jesus miracles from the point of view of theism? Then that such miraculous events should be a part of history is not, one might argue, particularly surprising. But then we are not justified in raising the evidential bar with respect to such claims. So theists may, after all, be justified in accepting that such events occurred solely on the basis of a limited amount of testimony, just as they would be justified in accepting the occurrence of other unusual, but non-supernatural, events. The application of Sagan’s principle that “extraordinary claims require extraordinary evidence” to the Jesus miracles simply presupposes, prior to any examination of the evidence, that theism is not, or is unlikely to be, true. We might call this response to Sagan’s principle the Presuppositions Move.
That there is something awry with the Presuppositions Move, at least as it stands, is strongly suggested by the fact that it appears to license those of us who believe in Bigfoot, psychic powers, the activities of fairies, etc. to adopt the same strategy – e.g. we may insist that we can quite reasonably accept, solely on the basis of Mary and John’s testimony, that fairies danced at the bottom of their garden last night, just so long as we presuppose, prior to any examination of the evidence, that fairies exist. Those making the Presuppositions Move with respect to religious miracles may be prepared to accept this consequence, but I suspect the majority of impartial observers will find it a lot to swallow – and indeed will continue to consider those who accept testimony of dancing fairies to be excessively credulous whether those believers happen to hold fairy-istic presuppositions or not.
I suspect at least part of what has gone wrong here is that, when it comes to assessing evidence for the Jesus miracles and other supernatural events, we do so having now acquired a great deal of evidence about the unreliability of testimony supposedly supporting such claims. We know – or at least ought to know by now – that such testimony is very often very unreliable (sightings of ghosts, fairies, and of course, even religious experiences and miracles, are constantly being debunked, exposed as fraudulent, etc.). But then, armed with this further knowledge about the general unreliability of this kind of testimony, even if we do happen to approach such testimony with theistic or fairy-istic presuppositions, surely we should still raise the evidential bar much higher for eye-witness reports of religious miracles or fairies than we do for more mundane claims.
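The point can be made vivid with a toy calculation (again, mine, not the paper’s; the numbers and the little Python sketch below are invented purely for illustration). Even granting a sympathetic prior, testimony of a type known to be frequently false cannot raise the posterior very far:

# Toy Bayesian update: P(M | T), the probability a claim M is true given
# testimony T. All numbers are illustrative assumptions, nothing more.

def posterior(prior_m, p_t_given_m, p_t_given_not_m):
    """Bayes' theorem for a binary hypothesis."""
    joint_m = p_t_given_m * prior_m
    joint_not_m = p_t_given_not_m * (1 - prior_m)
    return joint_m / (joint_m + joint_not_m)

# A mundane claim (e.g. Bert visited for tea): testimony of this kind is
# rarely given when false, so even a middling prior yields near-certainty.
print(posterior(prior_m=0.5, p_t_given_m=0.9, p_t_given_not_m=0.01))  # ~0.99

# A miracle claim assessed with a sympathetic ("theistic") prior, but where
# this class of testimony is often false (hoaxes, misperception, legend):
print(posterior(prior_m=0.1, p_t_given_m=0.9, p_t_given_not_m=0.3))   # ~0.25

On these made-up figures the friendly prior still leaves the claim more likely false than true – which is just the point about keeping the evidential bar high for this class of testimony.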
[[ENDNOTE It may be said that there is a relevant disanalogy between the application of the Presuppositions Move with respect to religious miracles and to fairies. We have now acquired good empirical evidence that there’s no such thing as fairies. Starting off an assessment of the empirical evidence with the presupposition that fairies exist is one thing. Retaining that presupposition in the teeth of empirical evidence to the contrary is quite another. The Presuppositions Move surely requires that we have come across no body of empirical evidence throwing into serious doubt the existence of what we have been presupposing exists. This blocks the application of the Presuppositions Move in defence of accepting testimony regarding fairies. However, while there’s good empirical evidence that there’s no such thing as fairies, there’s no such evidence against the existence of God. Thus the Move can still be made with respect to testimony of religious miracles.
An obvious difficulty with the above suggestion is the evidential problem of evil (for an assessment, see my “The Evil God Hypothesis” in Religious Studies 46 (2010), 353-373). Prima facie there is good empirical evidence that there is no God. In which case, the above suggestion looks to be no less an obstacle to the use of the Presuppositions Move with respect to religious miracles. So, prior to employing the Move, those theists insisting on the above disanalogy will need to come up with an adequate solution to the evidential problem of evil (a solution not dependent on the truth of religious miracle claims) – not an easy task.
END OF ENDNOTE]]
So, my suggestion is that P1 is, prima facie, a fairly plausible principle – a principle that is applicable to the testimony concerning the miracles of Jesus. Note that P1 at least allows for the possibility that we might reasonably suppose a miracle has happened. Of course, I do not claim to have provided anything like proof of P1. But it does appear fairly accurately to reflect one of the ways in which we assess evidence. We do, rightly, set the evidential bar much higher for extraordinary claims than we do for more mundane claims.
Wednesday, 25 August 2010
Sunday, 22 August 2010
The Third Wave
This might be worth introducing into a few British classrooms...
Remembering the 3rd Wave by Leslie Weinfield
Peninsula, September 1991
Although the specter of fascist resurgence seems largely forgotten in the euphoria of German reunification, it may not be far beneath the peaceful veneer of that nation, or any other, for that matter. Even the most ostensibly free and open societies are not immune to fascism's lure - including places like Palo Alto.
What came to be known as the "Third Wave" began at Cubberley High School in Palo Alto as a game without any direct reference to Nazi Germany, says Ron Jones, who had just begun his first teaching job in the 1966-67 academic year. When a social studies student asked about the German public's responsibility for the rise of the Third Reich, Jones decided to try to simulate what happened in Germany by having his students "basically follow instructions" for a day.
But one day turned into five, and what happened by the end of the school week spawned several documentaries, studies and related social experiments illuminating a dark side of human nature - and a major weakness in public education.
Before students arrived for class on Monday, Jones vigorously cleaned his classroom and arranged the desks in unusually straight rows. He dimmed the lights and played Wagnerian music as students drifted in for class. Then Jones, a popular instructor who normally avoided even such regimentation as taking roll, told his students that he could give them the keys to power and success - "Strength Through Discipline."
"It was thoroughly out of character for Ron Jones to say "Let's help the class out with a little more discipline," recalls a former student Philip Neel, now a television producer in Los Angeles. But because Jones was an interesting teacher, the class went along.
Classmate Mark Hancock remembers Jones adding a political cast and a set of incentives soon thereafter. "It was something like, if you're a good party member and play the game well, you can get an A. If you have a revolution and fail, you get an F. For a successful revolution, you get an A," recounts Hancock, currently a regional development director for a Los Angeles property company.
Jones next commanded the class to assume a new seating posture to strengthen student concentration and will: feet flat on the floor, hands across the small of the back, spines straight. And he added speed drills, after which the entire group could move from loitering outside the room to silent, seated attention in less than 30 seconds.
"Even when we started with Strength Through Discipline, it was easy for me to see the benefits of the posture," remarks Steve Coniglio, who now helps run a Truckee retail store. "Even on that very first day, I could notice that I was breathing better. I was more attentive in class."
Jones closed the first day's session with a few rules. Students had to be sitting at attention before the second bell, had to stand up to ask or answer questions and had to do it in three words or less, and were required to preface each remark with "Mr. Jones."
"At the end of that day, I was grandly happy. I mean, it seemed to work and everyone seemed to get into it," Jones still marvels. Grades were based on participation, and no one accepted the study hall alternative that Jones offered prior to commencing the exercise that day. But neither did anyone make a connection to the German history lessons they'd just completed. "Most of us were headed toward college," says Hancock. "It wasn't Nazi German life that mattered, it was Palo Alto grades."
Jones says he assumed the class would return to its usual format the next day. "But when I came in, the class was all sitting..." His voice trails off as his body snaps to military attention.
Jones considered calling a halt, but then went to the blackboard and wrote "Strength Through Community" below the previous day's slogan, "Strength Through Discipline."
"I began to lecture on community - something bigger than oneself, something enjoyable. They really bought that argument," Jones recalls.
A powerful sense of belonging had sprung up among lowly sophomores on the bottom rung of the three-year school, and Jones admits he soon became a part of the exercise as well as its leader.
"It was really a mistake, a terrible thing to do. My curiosity pulled me in at first, and then I liked it. They learned fast, didn't ask questions. It was easier as a teacher."
As his Strength Through Community lecture ended, he created a class salute by bringing his right hand toward his right shoulder in an outwardly curled position, resembling a wave. Jones named it the Third Wave, and - despite its similarity to Third Reich - claims he borrowed the term from beach folklore, which holds that the last wave in every series of three is the largest.
Students acknowledging each other this way in the halls attracted the attention of upper classmen, who clamored to know the salute's significance, Coniglio says. Cubberley students began skipping their regular classes, asking to be part of the Third Wave. In three days Jones' class had expanded to 60 students.
After telling the enlarged class that "strength is fine, now you must act," Jones assigned everyone a task to be completed that day. Some were to memorize the names and addresses of everyone in the group; others were to make Third Wave banners, armbands and membership cards. And since that day's theme was "Strength Through Action," everyone was to proselytize.
By day's end Coniglio says banners were all over the school, including a 20 footer in the library. Members brought in some 200 converts from other classes to be "sworn in."
"It just swept through the school," recalls Jones, who is still teaching, now at the San Francisco Recreation Center for the Handicapped. "It was like walking on slippery rock...by the third or fourth day, there was an obvious explosion of emotion that I couldn't control."
Several boys were assigned to "protect" Jones as he walked the school's corridors, wearing Third Wave armbands to signify their responsibility.
"It was a black band. When I went home, it got my parents worried," says Steve Benson, now a Palo Alto mechanic. "They thought it was the equivalent of the SS." Although his mother called Jones to express her concern, the teacher reassured her it was merely a class exercise.
Everyone involved in the Third Wave received a membership card, three of which Jones randomly marked with an X. Those holding the marked cards were told to note who transgressed class rules, which now dictated such matters as what campus paths members could walk and with whom they could associate.
"There were three or four stoolies," Jones explains bluntly. "I wanted to see how this was being taken outside of class."
By the end of four days, approximately half the class had approached Jones with detailed information about the transgressions of others, ranging from improper salutes to coup plots against him.
"It was phenomenal. There was a whole underground of activity. People were assigning themselves as guards," Jones says. "I knew exactly what was going on in class because of this strange snitching that was going on."
There was betrayal among teens who had been close friends since childhood. A group of buddies could be sharing a cigarette in the bathroom, discussing a plan to "kidnap" Jones the next day and fulfill the exercise's requirement for a top grade, but "it wouldn't happen," says Coniglio. "Somebody - one of those two or three - would inform Ron Jones of the plot."
Continue reading here.
My Children's Event, Edinburgh Lit. Festival 29th August
My event is 1pm 29th August, Edinburgh Festival. Tickets here.
Really Big Questions with Stephen Law
VENUE : Charlotte Square Gardens
FESTIVAL: Edinburgh International Book Festival
CATEGORY: Children
£4.00
Monday, 16 August 2010
Interview.
Interview with me available here, at Common Sense Atheism, if you are interested. It's about 40 minutes long.
Friday, 13 August 2010
Plantinga on evolution and naturalism
I just came across Pharyngula's criticism of Plantinga's short, sweet version of his argument against naturalism. It is here if you are interested.
Plantinga still runs the following type of argument that false beliefs are just as adaptive as true beliefs, and so evolution won't particularly favour true-belief-producing mechanisms:
Consider a frog sitting on a lily pad. A fly passes by; the frog flicks out its tongue to capture it. Perhaps the neurophysiology that causes it to do so, also causes beliefs. As far as survival and reproduction is concerned, it won't matter at all what these beliefs are: if that adaptive neurophysiology causes true belief (e.g., those little black things are good to eat), fine. But if it causes false belief (e.g., if I catch the right one, I'll turn into a prince), that's fine too.
Thursday, 12 August 2010
Drumming
Time wasting this afternoon trying different tunings on snare drum - medium/high, then high...
Chris Hallquist's book
I have been reading Chris Hallquist's UFOs, Ghosts and a Rising God, and I must say that, while I initially approached it with caution (I guess because the title makes it sound like it belongs on the shelves of a New Age bookstore), it is very well researched and written. The arguments are very strong. And entertaining, I should add. It's a discussion of the testimony-based evidence for the resurrection in light of what we know about other cults, testimony about ghosts, alien abductions, and so on.
No doubt many Biblical scholars would consider a close look at claims about ghosts and UFOs to be beneath them; but, actually, these are precisely the sort of claims they need to know more about if they are to have a genuinely balanced view of the historical evidence for the resurrection.
Wednesday, 11 August 2010
Science vids
This is a good resource for science teachers and parents with science-y kids... It includes this:
I also found this on diet coke and mentos.
Tuesday, 10 August 2010
Intro to book (part 1) new draft
Here is a new version of first part of the intro, for comments please. Too "academic" (remember - this has to become a best-seller and make me a fortune)? How could I make it more snappy and appealing?
INTRODUCTION
Intellectual black holes
Wacky and ridiculous belief systems abound. The Heaven’s Gate suicide cult promised members a ride to heaven on board a UFO. Advanced students of Scientology are taught that 75 million years ago, Xenu, alien ruler of a “Galactic Confederacy”, brought billions of people to Earth in spacecraft shaped like Douglas DC-10 airplanes and stacked them around volcanoes, which he then blew up with hydrogen bombs. Even mainstream religions have people believing absurdities. Preachers have promised 72 heavenly virgins to suicide bombers. Others insist the entire universe is just 6,000 years old (extraordinarily, polls consistently indicate this belief is currently held by about 45% of US citizens – that’s around 130 million individuals). And of course it’s not only cults and religions that promote bizarre beliefs. Significant numbers of people believe in astrology, the amazing powers of TV psychics, crystal divination, the healing powers of magnets, the prophecies of Nostradamus, that the pyramids were built by aliens, that the Holocaust never happened, and that the World Trade Center was brought down by the US Government.
How do such ridiculous views succeed in entrenching themselves in people’s minds? How are wacky belief systems able to take sane, intelligent, college-educated people and turn them into the willing slaves of claptrap? How, in particular, do the true believers manage to convince themselves that they are the rational, reasonable ones and that everyone else is deluded?
This book identifies eight key mechanisms that can transform a set of ideas into a psychological fly trap – a bubble of belief that, while seductively easy to enter, can be almost impossible to reason your way out of again.
Cosmologists talk about black holes, objects so gravitationally powerful that nothing, not even light, can escape from them. Unwary space travellers passing too close will find themselves sucked in. An increasingly powerful motor is required to resist the pull, until eventually one passes the “event horizon” and escape becomes impossible.
My suggestion is that our contemporary cultural landscape contains, if you like, numerous intellectual black holes – belief systems constructed in such a way that unwary passers-by can similarly find themselves drawn in. While those of us lacking robust intellectual and other psychological defences will be most easily trapped by such self-sealing bubbles of belief, even the most intelligent and educated of us are potentially vulnerable. Some of the world’s greatest thinkers have fallen in, never to escape. If you find yourself encountering a belief system in which several of these mechanisms feature prominently, be wary. Alarm bells should be going off and warning lights flashing. For you may now be approaching the event horizon of an intellectual black hole.
As the centuries roll by, such self-sealing bubbles of belief appear and disappear. When the conditions are right, several may fizz into existence all at once. Occasionally, one may grow huge, perhaps encompassing an entire civilization before dividing or deflating or popping or being subsumed by another bubble.
One of the greatest threats an intellectual black hole faces is the flourishing of a culture that promotes reason and encourages individuals to subject both their own and others’ beliefs to critical scrutiny - with none deemed off-limits. However, bubbles of irrationality are certainly able to survive and flourish even within such a society. Intellectual black holes are, perhaps, an ineradicable feature of human civilization. Which is not to say we should not do our level best to keep them under control.
Aim of this book
The central aim of this book is to help immunize readers against the wiles of cultists, religious nutcases, political zealots, conspiracy theorists, promoters of flaky alternative medicines, and so on by clearly setting out the tricks of the trade by which such self-sealing bubbles of belief are created and maintained. We will see how an intellectually impregnable fortress can be constructed around a set of even patently ridiculous beliefs, providing them with a veneer of “reasonableness” and rendering them immune to rational criticism.
Most of us will have at some point experienced the frustration of trying to change the convictions of someone powerfully committed to a ridiculous set of beliefs, and will have come up against at least some of these kinds of strategy. My aim here is to provide an overview of eight key strategies, which I call:
1. Playing the mystery card
2. “But it fits!” and the blunderbuss of crap
3. Moving the goal posts
4. Going nuclear
5. “I just know!”
6. Pseudo-profundity
7. The Amazingly Persuasive Power of Accumulated Anecdote (TAPPAA)
8. Pressing your buttons
In each case I (i) explain the strategy, (ii) diagnose exactly what is wrong with it, and (iii) provide illustrations of how it is applied. We will find out exactly how quacks peddling dubious medicines, cultists promoting nutty dogma, and conspiracy theorists who insist we’re ruled by aliens apply these techniques, so you’ll be better able to spot them. Or, if you want to set yourself up as chief guru of your own cult, start your own conspiracy theory, or whatever, how you can employ the techniques in order to dupe others (at the end of the book is a series of letters illustrating just how charlatans can and do deliberately use them).
Talking to a victim of an intellectual black hole can be a rather creepy experience. The intellectual abilities of such a person are strangely hobbled, but the victim is probably entirely unaware that they have been hobbled. Point out to them the flaws in their thinking. Try to show them evidence against what they believe. You’ll find their minds will just continue to slide around it. It’s as if they have a mental blind spot. Not only won’t they “get it” even when “it” has been made as plain as the nose on their face, they will have no inkling that there’s anything they’re missing out on. Victims typically exhibit what has been termed the Dunning-Kruger effect – where a person’s lack of knowledge or expertise in an area not only makes them inadequate, but also keeps them from discovering their own inadequacy. Dunning draws a parallel between the Dunning-Kruger effect and a peculiar medical condition called anosognosic paralysis:
An anosognosic patient who is paralyzed simply does not know that he is paralyzed. If you put a pencil in front of them and ask them to pick up the pencil in front of their left hand they won’t do it. And you ask them why, and they’ll say, “Well, I’m tired,” or “I don’t need a pencil.” They literally aren’t alerted to their own paralysis. There is some monitoring system on the right side of the brain that has been damaged, as well as the damage that’s related to the paralysis on the left side. There is also something similar called “hemispatial neglect.” It has to do with a kind of brain damage where people literally cannot see or they can’t pay attention to one side of their environment. If they’re men, they literally only shave one half of their face. And they’re not aware about the other half. If you put food in front of them, they’ll eat half of what’s on the plate and then complain that there’s too little food. You could think of the Dunning-Kruger Effect as a psychological version of this physiological problem. If you have, for lack of a better term, damage to your expertise or imperfection in your knowledge or skill, you’re left literally not knowing that you have that damage.
Someone sucked into an intellectual black hole will be increasingly unable to think straight, but their inability to think straight will mask from them their inability to think straight. They will continue to think they are thinking just fine, and consequently that we’re the ones whose thinking is screwy.
In fact, our inability to recognize the “truth” of their cult, conspiracy theory, or whatever, may lead them to suppose that it is our minds that have been hobbled, not theirs. If they are religious, they may suppose our thinking has been corrupted by sin, or has fallen under the demonic influence of Satan. Or, because we are not, like them, wearing protective tinfoil hats, that our minds have fallen victim to some sort of alien mind-control technology. That, they may suppose, is why we don’t see the evidence that the Earth is ruled by alien lizard people, while they do.
A sliding scale
Of course, the tinfoil hat brigade are an extreme example. It’s worth noting at the outset that intellectual black holes lie at one end of a sliding scale. The fact is, almost all of us engage in these eight strategies to some extent, particularly when beliefs to which we are strongly committed are faced with a rational threat. And in fact, under certain circumstances, there may be little wrong in using at least some of them in moderation (as I will explain). What transforms a belief system into an intellectual black hole is the extent to which such mechanisms are relied upon in dealing with rational threats and generating an appearance of “reasonableness”. The more we start to rely on these kinds of strategy to prop up and defend our belief system, the more black-hole-like that belief system becomes, until a black hole is clearly what we have got. Even if we have not fallen victim to an intellectual black hole, some of our belief systems may still exhibit an unhealthy reliance on the same strategies.
Religion
This book focuses particularly, though by no means exclusively, on religious examples of intellectual black holes. Why, given there are many non-religious examples from which to choose? My main reason for including many religious examples is that while plenty of belief-systems (e.g. political philosophies such as Marxism, New Age philosophies, belief systems involving dubious or bogus medical treatments, and belief systems centred on grand political conspiracies, such as those involving 9/11) also employ various combinations of these eight mechanisms to ensnare minds, religions typically employ a wider range. Historically, the established religions have had a great deal of time and huge intellectual and other resources to deploy in refining their own particular versions of these strategies. They have, as a result, also produced some of the most powerful and seductive versions. They therefore provide some of the best illustrations. Note also that one of the strategies – Moving the Goalposts – is almost exclusively employed in certain religious circles.
However, while the book contains many religious examples – from Young Earth Creationism to Christian Science – it’s worth emphasizing that I am not arguing that all religious belief-systems are essentially irrational. Several recent books have done that, including books by Christopher Hitchens, Sam Harris and Richard Dawkins. In fact, all three argue that the content of religious belief generally is not just nonsense, but often dangerous nonsense. My aim here is different. It is not the content of religious belief that I criticise, but the manner in which religious belief systems are sometimes defended and promoted.
Actually, any belief-system, including entirely sensible belief-systems, can be defended and promoted in much the same way. To point out that the defenders of a set of beliefs are employing such dubious methods is not yet to show that the content of those beliefs is false. Many of the same strategies can and have also been employed to defend and promote atheistic belief-systems (certain totalitarian regimes provide obvious illustrations).
In certain cases, the beliefs at the centre of an intellectual black hole may be true. In fact, it’s possible that those beliefs could be given an entirely reasonable justification. However, if those defending and promoting those core beliefs can’t come up with a decent justification, and instead rely increasingly on the kind of dodgy strategies described in this book, then their belief system still qualifies as an intellectual black hole. In other words, two belief systems could have at their core the very same set of beliefs, yet one could be an intellectual black hole and the other not.
On bullshit
So, when I talk, as I do, about an intellectual black hole being a bullshit belief system, I’m not suggesting that it’s the content that’s bullshit. Rather, it’s the manner in which the core beliefs are defended and promoted that marks out a belief system as bullshit.
According to the philosopher Harry Frankfurt, whose essay On Bullshit has become a minor philosophical classic, bullshit involves a kind of deliberate fakery. A bullshitter, says Frankfurt, is not the same thing as a liar. The bullshitter does not knowingly tell a fib – he does not assert something he himself knows to be false. Rather the bullshitter just says things to suit his purposes – so as to get away with something – without any care as to whether or not what he says is true.
I don’t entirely agree with Frankfurt’s analysis. Frankfurt’s definition, it seems to me, is in at least one respect too narrow. People regularly talk about astrology, feng shui or Christian Science (discussed in chapter xx), and so on as being bullshit, and their practitioners as bullshit artists, even while acknowledging that those who practice these things typically do so in all sincerity. Not only do the practitioners believe what they say, it really does matter to them that what they say is true. They care about truth.
What nevertheless marks out astrology, feng shui, and Christian Science as bullshit, I’d suggest, is the kind of faux reasonableness that their practitioners generate - the pseudo-scientific gloss that they are able to apply to their core beliefs. They create the illusion that what they believe is reasonable, while not themselves recognizing that they have conjured up only an illusion. They typically fool not only others, but themselves too.
Victims need not be stupid
Those who fall victim to intellectual black holes need be neither dim nor foolish. The sophistication of some of the strategies examined in this book demonstrates that those who develop and use them are often smart. They are, in many cases, ingenious strategies – sometimes very ingenious indeed. Nor need those who fall foul of intellectual black holes be generally gullible. The victims may, in other areas of their lives, be models of caution, subjecting claims to close critical scrutiny, weighing evidence scrupulously, and tailoring their beliefs according to robust rational standards. They are able, as it were, to compartmentalize their application of these strategies.
So if, after reading this book, you begin to suspect that you may yourself have fallen victim to an intellectual black hole, there’s no need to feel particularly foolish. People far cleverer than either you or me have become ensnared.
Why do we believe what we do?
This book examines eight key mechanisms by which a belief system can be transformed into an intellectual black hole - into a bullshit system of belief. It doesn’t attempt to explain why we are drawn to particular belief systems in the first place, or why we are often drawn to using the kind of mechanisms described in this book in their defence.
Why, for example, is belief in a god or gods, and in other supernatural beings, such as ghosts, angels, dead ancestors, and so on, so widespread? Belief in invisible, supernatural agents appears to be universal, and there is some evidence that a predisposition towards beliefs of this kind may actually be innate – part of our natural, evolutionary heritage. The psychologist Justin Barrett (REF XX), for example, has suggested that the prevalence of beliefs of this kind may in part be explained by our possessing a Hyper-sensitive Agent Detection Device, or H.A.D.D.
The H.A.D.D. Hypothesis
Human beings explain features of the world around them in two very different ways. For example, we sometimes appeal to natural causes or laws in order to account for an event. Why did that apple fall from the tree? Because the wind blew and shook the branch, causing the apple to fall. Why did the water freeze in the pipes last night? Because the temperature of the water fell below zero, and it is a law that water freezes below zero.
However, we also explain by appealing to agents – beings who act on the basis of their beliefs and desires in a more or less rational way. Why did the apple fall from the tree? Because Ted wanted to eat it, believed that shaking the tree would make it fall, and so shook the tree. Why are Mary’s car keys on the mantelpiece? Because she wanted to remind herself not to forget them, so put them where she thought she would spot them.
Barrett suggests we have evolved to be overly sensitive to agency. We evolved in an environment containing many agents – family members, friends, rivals, predators, prey, and so on. Spotting and understanding other agents helps us survive and reproduce. So we evolved to be sensitive to them – overly sensitive in fact. Hear a rustle in the bushes behind you and you instinctively spin round, looking for an agent. Most times, there’s no one there – just the wind in the leaves. But, in the environment in which we evolved, on those few occasions when there was an agent present, detecting it might well save your life. Far better to avoid several imaginary predators than be eaten by a real one. Thus evolution will select for an inheritable tendency to not just detect – but over-detect – agency. We have evolved to possess (or, perhaps more plausibly, to be) hyper-active agency detectors.
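The cost asymmetry driving that argument can be set out explicitly. Here is a minimal sketch (my illustration, in Python, with wholly invented costs and probabilities), showing that when a miss is vastly costlier than a false alarm, the expected-cost-minimizing policy fires on even very weak evidence of agency:

# Error-management toy model: should an ambiguous rustle be treated as an agent?
# Costs and probabilities are invented purely to illustrate the asymmetry.

COST_FALSE_ALARM = 1     # energy wasted fleeing from wind
COST_MISS = 1000         # being eaten by an undetected predator

def expected_cost_ignore(p_agent):
    return p_agent * COST_MISS

def expected_cost_flee(p_agent):
    return (1 - p_agent) * COST_FALSE_ALARM

# Even when an agent is very unlikely, fleeing is the cheaper policy:
for p in (0.002, 0.01, 0.1):
    print(p, expected_cost_flee(p) < expected_cost_ignore(p))  # True each time

# Fleeing wins whenever p > 1 / (1 + COST_MISS) ~= 0.001, so selection
# favours a hair-trigger - i.e. "hyper-active" - agency detector.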
If we do have an H.A.D.D., that would at least partly explain the human tendency to feel there is “someone there” even when no one is observed, and so may at least partly explain our tendency to believe in the existence of invisible agents – in spirits, ghosts, angels or gods.
For example, in his book The Illusion of Conscious Will, Daniel Wegner points out what he believes is the most remarkable characteristic of those using a ouija board (in which the planchette – often an upturned shot glass – appears to wander around the board, spelling out messages from “beyond”):
People using the board seem irresistibly drawn to the conclusion that some sort of unseen agent... is guiding the planchette movement... a theory immediately arises to account for this breakdown: the theory of outside agency." (p.113)
Because the movement nevertheless seems inexplicable and odd, it is immediately put down to the influence of a further, invisible agent.
However, I am not here endorsing the H.A.D.D. explanation for widespread belief in such invisible agents (though I suspect there is some truth to it). The fact is that, even if we do possess an H.A.D.D., that would at best only explain the attractiveness of the content of some of the belief systems we will be examining. Many wacky belief systems – such as those involving crystal healing or palmistry or numerology – appear to involve no invisible agents at all. I mention the H.A.D.D. hypothesis only to illustrate the point that the eight mechanisms identified in this book are not intended to rival such psychological and evolutionary explanations for why we believe what we do. My claim is that once we find ourselves drawn to a belief system, for whatever reason, then these eight mechanisms may come into play to bolster and defend it.
Note that the H.A.D.D. hypothesis does not say that there are no invisible agents. Perhaps at least some of the invisible agents people suppose exist are real. Perhaps there really are ghosts, or spirits, or gods. However, if we suppose the H.A.D.D. hypothesis does correctly explain why it is that so many people believe in the existence of invisible agents, then the fact that large numbers hold such beliefs can no longer be considered evidence that any such agents exist. It will no longer do to say, “Surely not all these people can be so very deluded? Surely there must be some truth to these beliefs, otherwise they would not be so widespread?” The fact is, if the H.A.D.D. hypothesis is correct, we are likely to believe in the existence of such invisible agents anyway, whether or not such agents exist. But then the commonality of these beliefs is not evidence that such agents exist.
Of course, I don’t deny that there was already good reason to be sceptical about appeals to what many people believe when it comes to justifying beliefs in invisible agents, as well as many other beliefs of a religious or supernatural character. The fact that around 45% of the citizens of one of the richest and best-educated populations on the planet believe the entire universe is only about six thousand years old is testament to the fact that, whatever else may be said about religion, it undoubtedly possesses a quite astonishing power to get very many people – even smart, college-educated people – to believe downright ridiculous things. Nevertheless, if the H.A.D.D. hypothesis is correct, it does add a further nail to the coffin lid of justifications of belief in invisible agents of one sort or another based on the thought: “Lots of people believe it, so there must be some truth to it!”
The theory of Cognitive Dissonance
The H.A.D.D. hypothesis may explain why we are drawn to certain belief systems in the first place – those involving invisible agents. Another psychological theory that may then play a role in explaining our propensity to use the kind of strategies described in this book to defend such theories is the theory of cognitive dissonance. Dissonance is the psychological discomfort we feel when we hold beliefs or attitudes that conflict. The theory says that we are motivated to reduce dissonance by either adjusting our beliefs and attitudes or rationalizing them.
The example of the “sour grapes” in Aesop’s story of The Fox and The Grapes is often used as an illustration of cognitive dissonance. The fox desires those juicy-looking grapes, but then, when he realizes he will never attain them, he adjusts his belief accordingly to make himself feel better – he decides the grapes are sour.
How might the theory of cognitive dissonance play a role in explaining why we are drawn to using the kind of belief-immunizing strategies described in this book? Here’s an example. Suppose, for the sake of argument, that our evolutionary history has made us innately predisposed both towards belief in supernatural agents and towards forming beliefs that are, broadly speaking, rational, or at the very least not downright irrational. That might put us in a psychological bind. On the one hand, we may find ourselves unwilling or even unable to give up our belief in certain invisible agents. On the other hand, we may find ourselves confronted by overwhelming evidence that what we believe is pretty silly. Under these circumstances, strategies promising to disarm rational threats to our belief and give it at least the illusion of reasonableness are likely to seem increasingly attractive. Such strategies can provide us with a way of dealing with the intellectual tension and discomfort such innate tendencies might otherwise produce. They allow true believers to reassure themselves that they are not being nearly as irrational as reason might otherwise suggest - to convince themselves and others that their belief in ghosts or spirits or whatever, even if not well-confirmed, is at least not contrary to reason.
So we can speculate about why certain belief systems are attractive, and also why such strategies are employed to immunize them against rational criticism and provide a veneer of “reasonableness”. Both the H.A.D.D. hypothesis and the theory of cognitive dissonance may have a role to play.
INTRODUCTION
Intellectual black holes
Wacky and ridiculous belief systems abound. The Heaven’s Gate suicide cult promised members a ride to heaven on board a UFO. Advanced students of scientology are taught that 75 million years ago, Xenu, alien ruler of a “Galactic Confederacy”, brought billions of people to Earth in spacecraft shaped like Douglas DC-10 airplanes and stacked them around volcanoes which he then blew up with hydrogen bombs. Even mainstream religions have people believing absurdities. Preachers have promised 72 heavenly virgins to suicide bombers. Others insist the entire universe is just 6,000 years old (extraordinarily, polls consistently indicate this belief is currently held by about 45% of US citizens – that’s around 130 million individuals). And of course it’s not only cults and religions that promote bizarre beliefs. Significant numbers of people believe in astrology, the amazing powers of TV psychics, astrology, crystal divination, the healing powers of magnets, the prophecies of Nostradamus, that the pyramids were built by aliens, that the Holocaust never happened, and that the World Trade Centre was brought down by the US Government.
How do such ridiculous views succeed in entrenching themselves in people’s minds? How are wacky belief systems able to take sane, intelligent, college-educated people and turn them into the willing slaves of claptrap? How, in particular, do the true believers manage to convince themselves that they are the rational, reasonable ones and that everyone else is deluded?
This book identifies eight key mechanisms that can transform a set of ideas into a psychological fly trap – a bubble of belief that, while seductively easy to enter, can be almost impossible to reason your way out of again.
Cosmologists talk about black-holes, objects so gravitationally powerful that nothing, not even light, can escape from them. Unwary space travellers passing too close will find themselves sucked in. An increasingly powerful motor is required to resist its pull, until eventually one passes the “event horizon” and escape becomes impossible.
My suggestion is that our contemporary cultural landscape contains, if you like, numerous intellectual black-holes – belief systems constructed in such a way that unwary passers-by can similarly find themselves drawn in. While those of us lacking robust intellectual and other psychological defences will be most easily trapped by such self-sealing bubbles of belief, even the most intelligent and educated of us are potentially vulnerable. Some of the world’s greatest thinkers have fallen in, never to escape. If you find yourself encountering a belief system in which several of these mechanisms feature prominently, be wary. Alarm bells should be going off and warning lights flashing. For you may now be approaching the event horizon of an intellectual black hole.
As the centuries roll by, such self-sealing bubbles of belief appear and disappear. When the conditions are right, several may fizz into existence all at once. Occasionally, one may grow huge, perhaps encompassing an entire civilization before dividing or deflating or popping or being subsumed by another bubble.
One of the greatest threats an intellectual black hole faces is the flourishing of a culture that promotes reason and encourages individuals to subject both their own and others’ beliefs to critical scrutiny - with none deemed off-limits. However, bubbles of irrationality are certainly able to survive and flourish even within such a society. Intellectual black holes are, perhaps, an ineradicable feature of human civilization. Which is not to say we should not do our level best to keep them under control.
Aim of this book
The central aim of this book is to help immunize readers against the wiles of cultists, religious nutcases, political zealots, conspiracy theorists, promoters of flaky alternative medicines, and so on by clearly setting out the tricks of the trade by which such self-sealing bubbles of belief are created and maintained. We will see how an intellectually impregnable fortress can be constructed around a set of even patently ridiculous beliefs, providing them with a veneer of “reasonableness” and rendering them immune to rational criticism.
Most of us will have at some point experienced the frustration and of trying to change the convictions of someone powerfully committed to a ridiculous set of beliefs, and will have come up against at least some of these kinds of strategy. My aim here is to provide an overview of eight key strategies, which I call:
1. Playing the mystery card
2. “But it fits!” and the blunderbus of crap
3. Moving the goal posts
4. Going nuclear
5. ” I just know!”
6. Pseudo-profundity
7. The Amazingly Persuasive Power of Accumulated Anecdote (TAPPAA)
8. Pressing your buttons
In each case I (i) explain the strategy, (ii) diagnose exactly what is wrong with it, and (ii) provide illustrations of how it is applied. We will find out exactly how quacks peddling dubious medicines, cultists promoting nutty dogma, conspiracy theorists who insist we’re ruled by aliens apply these techniques, so you’ll be better able to spot them. Or, if you want to set yourself up as chief guru of your own cult, start your own conspiracy theory, or whatever, how you can employ the techniques in order to dupe others (a the end of the book are a series of letters illustrating just how charlatans can deliberately can and do use them).
Talking to a victim of an intellectual black hole can be a rather creepy experience. The intellectual abilities of such a person are strangely hobbled, but the victim is probably entirely unaware that they have been hobbled. Point out to them the flaws in their thinking. Try to show them evidence against what they believe. You’ll find their minds will just continue to slide around it. It’s as if they have a mental blind spot. Not only won’t they “get it” even when “it” has been made as plain as the nose on their face, they will have no inkling that there’s anything they’re missing out on. Victims typically exhibit what has been termed the Dunning-Kruger effect – where a person’s lack of knowledge or expertise in an area not only makes them inadequate, but also keeps them from discovering their own inadequacy. Dunning draws a parallel between the Dunning-Kruger effect and a peculiar medical condition called anosognosic paralysis:
An anosognosic patient who is paralyzed simply does not know that he is paralyzed. If you put a pencil in front of them and ask them to pick up the pencil in front of their left hand they won’t do it. And you ask them why, and they’ll say, “Well, I’m tired,” or “I don’t need a pencil.” They literally aren’t alerted to their own paralysis. There is some monitoring system on the right side of the brain that has been damaged, as well as the damage that’s related to the paralysis on the left side. There is also something similar called “hemispatial neglect.” It has to do with a kind of brain damage where people literally cannot see or they can’t pay attention to one side of their environment. If they’re men, they literally only shave one half of their face. And they’re not aware about the other half. If you put food in front of them, they’ll eat half of what’s on the plate and then complain that there’s too little food. You could think of the Dunning-Kruger Effect as a psychological version of this physiological problem. If you have, for lack of a better term, damage to your expertise or imperfection in your knowledge or skill, you’re left literally not knowing that you have that damage.
Someone sucked into an intellectual black hole will be increasingly unable to think straight, but their inability to think straight will mask from them their inability to think straight. They will continue to think they are thinking just fine, and consequently that we’re the ones whose thinking is screwy.
In fact, our inability to recognize the “truth” of their cult, conspiracy theory, or whatever, may lead them to suppose that it is our minds that have been hobbled, not theirs. If they are religious, they may suppose our thinking has been corrupted by sin, or has fallen under the demonic influence of Satan. Or they may suppose that, because we are not, like them, wearing protective tinfoil hats, our minds have fallen victim to some sort of alien mind-control technology. That, they may suppose, is why we don’t see the evidence that the Earth is ruled by alien lizard people, while they do.
A sliding scale
Of course, the tinfoil hat brigade are an extreme example. It’s worth noting at the outset that intellectual black holes lie at one end of a sliding scale. The fact is, almost all of us engage in these eight strategies to some extent, particularly when beliefs to which we are strongly committed are faced with a rational threat. And in fact, under certain circumstances, there may be little wrong in using at least some of them in moderation (as I will explain). What transforms a belief system into an intellectual black hole is the extent to which such mechanisms are relied upon in dealing with rational threats and generating an appearance of “reasonableness”. The more we start to rely on these kinds of strategy to prop up and defend our belief system, the more black-hole-like that belief system becomes, until a black hole is clearly what we have got. Even if we have not fallen victim to an intellectual black hole, some of our belief systems may still exhibit an unhealthy reliance on the same strategies.
Religion
This book focuses particularly, though by no means exclusively, on religious examples of intellectual black holes. Why, given there are many non-religious examples from which to choose? My main reason for including many religious examples is that while plenty of belief systems – e.g. political philosophies such as Marxism, New Age philosophies, belief systems involving dubious or bogus medical treatments, and belief systems centred on grand political conspiracies (such as those involving 9/11) – also employ various combinations of these eight mechanisms to ensnare minds, religions typically employ a wider range. Historically, the established religions have had a great deal of time and huge intellectual and other resources to deploy in refining their own particular versions of these strategies. They have, as a result, also produced some of the most powerful and seductive versions. They therefore provide some of the best illustrations. Note also that one of the strategies – Moving the Goalposts – is almost exclusively employed in certain religious circles.
However, while the book contains many religious examples – from Young Earth Creationism to Christian Science – it’s worth emphasizing that I am not arguing that all religious belief-systems are essentially irrational. Several recent books have done that, including books by Christopher Hitchens, Sam Harris and Richard Dawkins. In fact, all three argue that the content of religious belief generally is not just nonsense, but often dangerous nonsense. My aim here is different. It is not the content of religious belief that I criticise, but the manner in which religious belief systems are sometimes defended and promoted.
Actually, any belief-system, including entirely sensible belief-systems, can be defended and promoted in much the same way. To point out that the defenders of a set of beliefs are employing such dubious methods is not yet to show that the content of those beliefs is false. Many of the same strategies can be, and have been, employed to defend and promote atheistic belief-systems (certain totalitarian regimes provide obvious illustrations).
In certain cases, the beliefs at the centre of an intellectual black hole may be true. In fact, it’s possible that those beliefs could be given an entirely reasonable justification. However, if those defending and promoting those core beliefs can’t come up with a decent justification, and instead rely increasingly on the kind of dodgy strategies described in this book, then their belief system still qualifies as an intellectual black hole. In other words, two belief systems could have at their core the very same set of beliefs, yet one could be an intellectual black hole and the other not.
On bullshit
So, when I talk, as I do, about an intellectual black hole being a bullshit belief system, I’m not suggesting that it is the content that’s bullshit. Rather, it’s the manner in which the core beliefs are defended and promoted that marks out a belief system as bullshit.
According to the philosopher Harry Frankfurt, whose essay On Bullshit has become a minor philosophical classic, bullshit involves a kind of deliberate fakery. A bullshitter, says Frankfurt, is not the same thing as a liar. The bullshitter does not knowingly tell a fib – he does not assert something he himself knows to be false. Rather the bullshitter just says things to suit his purposes – so as to get away with something – without any care as to whether or not what he says is true.
I don’t entirely agree with Frankfurt’s analysis. Frankfurt’s definition, it seems to me, is in at least one respect too narrow. People regularly talk about astrology, feng shui or Christian Science (discussed in chapter xx), and so on as being bullshit, and their practitioners as bullshit artists, even while acknowledging that those who practise these things typically do so in all sincerity. Not only do the practitioners believe what they say, it really does matter to them that what they say is true. They care about truth.
What nevertheless marks out astrology, feng shui and Christian Science as bullshit is, I’d suggest, the kind of faux reasonableness that their practitioners generate – the pseudo-scientific gloss that they are able to apply to their core beliefs. They create the illusion that what they believe is reasonable, while not themselves recognizing that they have conjured up only an illusion. They typically fool not only others, but themselves too.
Victims need not be stupid
Those who fall victim to intellectual black holes need be neither dim nor foolish. The sophistication of some of the strategies examined in this book demonstrates that those who develop and use them are often smart. They are, in many cases, ingenious strategies – sometimes very ingenious indeed. Nor need those who fall foul of intellectual black holes be generally gullible. The victims may, in other areas of their lives, be models of caution, subjecting claims to close critical scrutiny, weighing evidence scrupulously, and tailoring their beliefs according to robust rational standards. They are able, as it were, to compartmentalize their application of these strategies.
So if, after reading this book, you begin to suspect that you may yourself have fallen victim to an intellectual black hole, there’s no need to feel particularly foolish. People far cleverer than either you or me have become ensnared.
Why do we believe what we do?
This book examines eight key mechanisms by which a belief system can be transformed into an intellectual black hole – into a bullshit system of belief. It doesn’t attempt to explain why we are drawn to particular belief systems in the first place, or why we are often drawn to using the kinds of mechanism described in this book in their defence.
Why, for example, is belief in a god or gods, and in other supernatural beings, such as ghosts, angels, dead ancestors, and so on, so widespread? Belief in invisible, supernatural agents appears to be universal, and there is some evidence that a predisposition towards beliefs of this kind may actually be innate – part of our natural, evolutionary heritage. The psychologist Justin Barrett (REF XX), for example, has suggested that the prevalence of beliefs of this kind may in part be explained by our possessing a Hyper-sensitive Agent Detection Device, or H.A.D.D.
The H.A.D.D. hypothesis
Human beings explain features of the world around them in two very different ways. For example, we sometimes appeal to natural causes or laws in order to account for an event. Why did that apple fall from the tree? Because the wind blew and shook the branch, causing the apple to fall. Why did the water freeze in the pipes last night? Because the temperature of the water fell below zero, and it is a law that water freezes below zero.
However, we also explain by appealing to agents – beings who act on the basis of their beliefs and desires in a more or less rational way. Why did the apple fall from the tree? Because Ted wanted to eat it, believed that shaking the tree would make it fall, and so shook the tree. Why are Mary’s car keys on the mantelpiece? Because she wanted to remind herself not to forget them, so put them where she thought she would spot them.
Barrett suggests we have evolved to be overly sensitive to agency. We evolved in an environment containing many agents – family members, friends, rivals, predators, prey, and so on. Spotting and understanding other agents helps us survive and reproduce. So we evolved to be sensitive to them – overly sensitive in fact. Hear a rustle in the bushes behind you and you instinctively spin round, looking for an agent. Most times, there’s no one there – just the wind in the leaves. But, in the environment in which we evolved, on those few occasions when there was an agent present, detecting it might well save your life. Far better to avoid several imaginary predators than be eaten by a real one. Thus evolution will select for an inheritable tendency to not just detect – but over-detect – agency. We have evolved to possess (or, perhaps more plausibly, to be) hyper-active agency detectors.
If we do have an H.A.D.D., that would at least partly explain the human tendency to feel there is “someone there” even when no one is observed, and so may at least partly explain our tendency to believe in the existence of invisible agents – in spirits, ghosts, angels or gods.
For example, in his book The Illusion of Conscious Will, Daniel Wegner points out what he believes is the most remarkable characteristic of those using a ouija board (in which the planchette – often an upturned shot glass – appears to wander around the board, spelling out messages from “beyond”):
“People using the board seem irresistibly drawn to the conclusion that some sort of unseen agent... is guiding the planchette movement... a theory immediately arises to account for this breakdown: the theory of outside agency.” (p. 113)
The planchette’s movement is, in fact, produced by the sitters’ own unconscious muscle movements. But because the movement nevertheless seems inexplicable and odd to them, it is immediately put down to the influence of a further, invisible agent.
However, I am not here endorsing the H.A.D.D. explanation for widespread belief in such invisible agents (though I suspect there is some truth to it). The fact is that, even if we do possess an H.A.D.D., that would at best only explain the attractiveness of the content of some of the belief systems we will be examining. Many wacky belief systems – such as those involving crystal healing or palmistry or numerology – appear to involve no invisible agents at all. I mention the H.A.D.D. hypothesis only to illustrate the point that the eight mechanisms identified in this book are not intended to rival such psychological and evolutionary explanations of why we believe what we do. My claim is that once we find ourselves drawn to a belief system, for whatever reason, these eight mechanisms may come into play to bolster and defend it.
Note that the H.A.D.D. hypothesis does not say that there are no invisible agents. Perhaps at least some of the invisible agents people suppose exist are real. Perhaps there really are ghosts, or spirits, or gods. However, if we suppose the H.A.D.D. hypothesis does correctly explain why so many people believe in the existence of invisible agents, then the fact that large numbers hold such beliefs can no longer be considered evidence that any such agents exist. It will no longer do to say, “Surely not all these people can be so very deluded? Surely there must be some truth to these beliefs, otherwise they would not be so widespread?” The fact is, if the H.A.D.D. hypothesis is correct, we are likely to believe in the existence of such invisible agents anyway, whether or not they exist. But then the commonality of these beliefs is no evidence that such agents exist.
Of course, I don’t deny that there was already good reason to be sceptical about appeals to what many people believe when it comes to justifying beliefs in invisible agents, as well as many other beliefs of a religious or supernatural character. The fact that around 45% of the population of one of the richest and best-educated nations on the planet believes the entire universe is only about six thousand years old is testament to the fact that, whatever else may be said about religion, it undoubtedly possesses a quite astonishing power to get very many people – even smart, college-educated people – to believe downright ridiculous things. Nevertheless, if the H.A.D.D. hypothesis is correct, it drives a further nail into the coffin of justifications of belief in invisible agents of one sort or another based on the thought: “Lots of people believe it, so there must be some truth to it!”
The theory of cognitive dissonance
The H.A.D.D. hypothesis may explain why we are drawn to certain belief systems in the first place – those involving invisible agents. Another psychological theory that may play a role in explaining our propensity to use the kind of strategies described in this book to defend such beliefs is the theory of cognitive dissonance. Dissonance is the psychological discomfort we feel when we hold beliefs or attitudes that conflict. The theory says that we are motivated to reduce dissonance by either adjusting our beliefs and attitudes or rationalizing them.
The example of the “sour grapes” in Aesop’s story of The Fox and The Grapes is often used as an illustration of cognitive dissonance. The fox desires those juicy-looking grapes, but then, when he realizes he will never attain them, he adjusts his belief accordingly to make himself feel better – he decides the grapes are sour.
How might the theory of cognitive dissonance play a role in explaining why we are drawn to using the kind of belief-immunizing strategies described in this book? Here’s an example. Suppose, for the sake of argument, that our evolutionary history has made us innately predisposed both towards belief in supernatural agents and towards forming beliefs that are, broadly speaking, rational, or at the very least not downright irrational. That might put us in a psychological bind. On the one hand, we may find ourselves unwilling or even unable to give up our belief in certain invisible agents. On the other hand, we may find ourselves confronted by overwhelming evidence that what we believe is pretty silly. Under these circumstances, strategies promising to disarm rational threats to our belief and give it at least the illusion of reasonableness are likely to seem increasingly attractive. Such strategies can provide us with a way of dealing with the intellectual tension and discomfort such innate tendencies might otherwise produce. They allow true believers to reassure themselves that they are not being nearly as irrational as reason might otherwise suggest – to convince themselves and others that their belief in ghosts or spirits or whatever, even if not well-confirmed, is at least not contrary to reason.
So we can speculate about why certain belief systems are attractive, and also why such strategies are employed to immunize them against rational criticism and provide a veneer of “reasonableness”. Both the H.A.D.D. hypothesis and the theory of cognitive dissonance may have a role to play.
Monday, 9 August 2010
Logo Competition
We only received three entries for the new CFI logo competition, and, on reflection, the judges decided none were quite right. However, because we very much appreciated the effort that these three individuals made, we are giving them each two years' free membership of CFI UK, which gives them free entry to all Conway Hall events (which I am just organizing now).
Can the three contributors email me their postal addresses - thanks.
Saturday, 7 August 2010
Jazz drumming Sunday evening
I am taking my drum kit down to the Old Anchor pub in Abingdon (Oxfordshire) (St Helen's Wharf, OX14 5EN) for a jazz jam tomorrow night, Sunday August 8th, 8.30pm onwards. Can't say what it will be like, but should be fun.
I promise there will actually be some philosophy on this blog shortly...
Jesus paper published
The journal Faith and Philosophy has accepted my piece on "Miracles, Evidence and The Existence of Jesus", which evolved from discussions on this blog. So thanks for all your comments, provocations, etc.
I will put the final version up here eventually.
That's three papers in Philosophy of Religion now accepted for publication. "The Evil God Challenge" has just been published in Religious Studies. "Plantinga's Belief-Cum-Desire Argument Refuted" appears in Religious Studies shortly.
If you want a copy of any of these, let me know...