PRESSING YOUR BUTTONS
One way in which we can shape the beliefs of others is by rational persuasion. Suppose, for example, that I want someone to believe that Buckingham Palace is in London (which it is). I could provide them with a great deal of evidence to support that belief. I could also just take them to London so they can see with their own eyes that that’s where Buckingham Palace is located.
But what if these kinds of method aren’t available? Suppose I have little or no evidence to support the belief I nevertheless want people to accept. Suppose I can’t just show them that it’s true. How else might I get them to believe?
I might try to dupe them, of course. I could produce fraudulent evidence and bogus arguments. But what if I suspect this won’t be enough? What if I think my deceit is likely to be detected? Another option is to drop even the pretence of rational persuasion and to adopt what I call Pressing Your Buttons.
Belief-shaping mechanisms
All sorts of causal mechanisms can be used to shape belief. For example, our beliefs are shaped by social and psychological mechanisms such as peer pressure and a desire to conform. Finding ourselves believing something of which our community disapproves is a deeply uncomfortable experience, an experience that may lead us unconsciously to tailor what we believe so that we remain in step with those around us. We’re far more susceptible to such social pressures than we like to believe (as several famous psychological studies have shown[i]).
Belief can also be shaped through the use of reward and punishment. A grandmother may influence the beliefs of her grandson by giving him a sweet whenever he expresses the kind of beliefs of which she approves, and by ignoring or smacking him when he expresses the “wrong” sort of belief. Over time, this may change not just the kind of beliefs her grandson expresses, but also the kinds of belief he holds.
Perhaps beliefs might also be directly implanted in us. Some suppose God has implanted certain beliefs in at least some of us. Our evolutionary history may also produce certain beliefs, or at least certain predispositions to belief. For example, there’s growing evidence that a disposition towards religious belief is part of our evolutionary heritage, bestowed on us by natural selection. But even if neither God nor evolution has implanted beliefs in us, perhaps we’ll one day be able to implant beliefs ourselves using technology. Perhaps we’ll be able to strap a brain-state-altering helmet on to an unwitting victim while they sleep, dial in the required belief, press the red button and “Bing!”, our victim wakes up with the belief we’ve programmed them to hold. That would be a rather cruel trick. Some hypnotists claim a similar ability to, as it were, directly “inject” beliefs into people’s minds.
Obviously, these kinds of causal mechanism can operate on us without our realizing what’s going on. I might think I condemn racism because I have good grounds for supposing racism is morally wrong, but the truth is I have merely caved in to peer pressure and my desire not to be ostracised by my liberal family and friends. If a belief has been implanted in me by, say, natural selection, or by some brain-state-altering device, then, again, I may not be aware that this is the reason why I believe. Suppose, for example, that some prankster uses the belief-inducing helmet described above to programme me to believe I have been abducted by aliens. I wake up one morning and find, as a result, that I now very strongly believe I was taken aboard a flying saucer during the night. I have no awareness of the real reason why I now hold that belief – of the mechanism that actually produced the belief in me. If asked how I know I was abducted, I will probably say “I just know!”
Isolation, control, uncertainty, repetition, emotion
I’m going to focus here on five important belief-shaping mechanisms: isolation, control, uncertainty, repetition and emotion.
(i) isolation. Isolation is a useful belief-shaping tool. An isolated individual is more vulnerable to various forms of psychological manipulation. If you want someone to believe something that runs contrary to what their friends and family believe, it’s a good idea to have them spend some time at a retreat or remote training camp where their attachment to other ideas can more easily be undermined. Cults often isolate their members in this way. The cult leader Jim Jones physically moved both himself and all his followers to the Guyanese jungle (where they all eventually committed suicide). Isolation is also recommended by some within more mainstream religions. In the UK, hermetically sealed-off religious schools are not uncommon. Students at the Tarbiyah Academy in Dewsbury, for example, are allegedly taught that
‘the enemies of Allah’ have schemed to poison the thinking and minds of [Muslim] youth and to plant the spirit of unsteadiness and moral depravity in their lives. Parents are told that they betray their children if they allow them to befriend non-Muslims.[ii]
A related mechanism is:
(ii) control. If you want people to accept your belief system, it’s unwise to expose them to alternative systems of belief. Gain control over the kind of ideas to which they have access and to which they are exposed. Censor beliefs and ideas that threaten to undermine your own. This kind of control is often justified on the grounds that people will otherwise be corrupted or confused. Totalitarian regimes will often remove “unhealthy” books from their libraries if the books contradict the regime. All sorts of media are restricted on the grounds that they will only “mislead” people. Schools under totalitarian regimes will sometimes justify preventing children from discovering or exploring other points of view on the grounds that doing so will only “muddle” them. Take a leaf out of the manuals of such regimes and restrict your followers’ field of vision so that everything is interpreted through a single ideological lens – your own.
(iii) uncertainty. If you want people to abandon their former beliefs and embrace your own, or if you want to be sure they won’t reject your beliefs in favour of others, it helps to raise as much doubt and uncertainty as possible about those rival beliefs. Uncertainty is a potent source of stress, so the more you associate alternative beliefs with uncertainty, the better. Ideally, offer a simple set of concrete, easily formulated and remembered certainties designed to give meaning to and cover every aspect of life. By constantly harping on the vagaries, uncertainties and meaninglessness of life outside your belief system, the simple, concrete certainties you offer may begin to seem increasingly attractive to your audience.
(iv) repetition. Encourage repetition. Get people to recite what you want them to believe over and over again in a mantra-like way. Make the beliefs trip unthinkingly off their tongues. It doesn’t matter whether your subjects accept what they are saying, or even fully understand it, to begin with. There’s still a fair chance that belief will eventually take hold. Mindless repetition works especially well when applied in situations in which your subjects feel powerful pressure to conform. Lining pupils up in playgrounds for a daily, mantra-like recitation of your key tenets, for example, combines repetition with a situation in which any deviation by an individual will immediately result in a hundred pairs of eyes turned in their direction.
(v) emotion. Emotion can be harnessed to shape belief. Fear is particularly useful. In George Orwell’s novel Nineteen Eighty-Four, the regime seeks control not just over people’s behaviour but also, even more importantly, over what they think and feel. When the hapless rebel Winston is finally captured, his “educators” make it clear that what ultimately concerns them are his thoughts:
“And why do you imagine that we bring people to this place?”
“To make them confess.”
“No, that is not the reason. Try again.”
“To punish them.”
“No!” exclaimed O’Brien. His voice had changed extraordinarily, and his face had suddenly become both stern and animated. “No! Not merely to extract your confession, not to punish you. Shall I tell you why we have brought you here? To cure you! To make you sane! Will you understand, Winston, that no one whom we bring to this place ever leaves our hands uncured? We are not interested in those stupid crimes that you have committed. The Party is not interested in the overt act: the thought is all we care about.”[iii]
The terrifying contents of Room 101 eventually cause Winston to succumb. He ends up genuinely believing that if Big Brother says that two plus two equals five, then two plus two does equal five. Many real regimes have been prepared to employ similarly brutal methods to control what is going on in people’s minds. However, emotional manipulation can take much milder forms yet still be effective. For example, you might harness the emotional power of iconic music and imagery. Ensure people are regularly confronted by portraits of Our Leader accompanied by smiling children and sunbeams emanating from his head (those Baghdad murals of Saddam Hussein spring to mind). Ensure your opponents and critics are always portrayed accompanied by images of catastrophe and suffering, or even Hieronymus-Bosch-like visions of hell. Make people emotionally dependent on your own belief system. Ensure that what self-esteem and sense of meaning, purpose and belonging they have is derived as far as possible from their belonging to your system of belief. Make sure they recognise that abandoning that belief system will involve the loss of things about which they care deeply.
It goes without saying that these five mechanisms of thought-control are popular with various totalitarian regimes. They are also a staple of many extreme religious cults.
Applied determinedly and systematically, these mechanisms can be highly effective in shaping belief and suppressing “unacceptable” lines of thought. They are particularly potent when applied to children and young adults, whose critical defences are weak, and who have a sponge-like tendency to accept whatever they are told.
Note that traditional mainstream religious education has sometimes also involved heavy reliance on many, sometimes all, of these five mechanisms. I was struck by a story a colleague once told me: as a teenage pupil at a rather strict Catholic school in the 1960s, she once put her hand up in class to ask why contraception was wrong. She was immediately sent to the headmaster, who asked her why she was obsessed with sex. Interestingly, my colleague added that, even before she asked the question, she knew she shouldn’t. While never explicitly saying so, her school and wider Catholic community had managed to convey to her that asking such a question was unacceptable. Her role was not to think and question, but to passively accept. My colleague added that, even today, nearly half a century later, despite the fact that she no longer has any religious conviction, she finds herself feeling guilty if she dares to question a Catholic belief. So effective was her religious upbringing in straitjacketing her thinking that she still feels instinctively that to do so is to commit a thought-crime.
Of course, religious education doesn’t have to be like this, and often it isn’t. An open, questioning attitude can be encouraged rather than suppressed. Still, it’s clear that some mainstream religions have historically been very reliant upon such techniques so far as the transmission of the faith from one generation to the next is concerned. In some places, they still are.
Brainwashing
Applied in a consistent and systematic fashion, these various techniques add up to what many would call “brainwashing”. Kathleen Taylor, a research scientist in physiology at the University of Oxford, upon whose work I am partly drawing here, has published a book on brainwashing. In an associated newspaper article, Taylor writes:
One striking fact about brainwashing is its consistency. Whether the context is a prisoner of war camp, a cult’s headquarters or a radical mosque, five core techniques keep cropping up: isolation, control, uncertainty, repetition and emotional manipulation.[iv]
Taylor adds in her book that, within the discipline of psychology, “brainwashing” is an increasingly superfluous word. It can be a misleading term, associated as it is with Manchurian-Candidate-type stories of seemingly ordinary members of the public transformed into presidential assassins on hearing a trigger phrase. As Taylor says, that kind of brainwashing is a myth. Case studies suggest there is
no “magic” process called “brainwashing”, though many (including the U.S. government) have spent time and money looking for such a process. Rather the studies suggest that brainwashing… is best regarded as a collective noun for various, increasingly well-understood techniques of non-consensual mind-change.
The unwitting and well-intentioned brainwasher
Often, those who use such techniques are despicable people with the evil aim of enslaving minds. Edward Hunter, the CIA operative who coined the phrase back in 1950, characterized brainwashing in emotive terms:
The intent is to change a mind radically so that its owner becomes a living puppet – a human robot – without the atrocity being visible from the outside. The aim is to create a mechanism in flesh and blood, with new beliefs and new thought processes inserted into a captive body. What that amounts to is the search for a slave race that, unlike the slaves of olden times, can be trusted never to revolt, always to be amenable to orders, like an insect to its instincts.
Perhaps this was very often the intent of the regimes of which Hunter had experience. However, the intent to produce mental slaves is surely not required for brainwashing. Sometimes those who apply these techniques genuinely believe themselves to be doing good. Their intention is not to enslave but to free their victims from evil and illusion. Yet, despite the absence of any evil intent, heavy reliance on such techniques still adds up to brainwashing. Brainwashers can be good people with little or no awareness that what they are engaged in is brainwashing.
The consenting victim
In the second quotation above, Taylor says that brainwashing involves various techniques of non-consensual mind-change. That cannot be quite right. Of course, prisoners of war don’t usually consent to being brainwashed. But people can in principle consent. In one well-known thriller, the trained assassin at the heart of the film turns out to have agreed to be brainwashed. The fact that he consented to have such techniques applied to him doesn’t entail that he wasn’t brainwashed.
People sometimes willingly submit themselves to brainwashing. They sign up to be brainwashed at a cult’s training camp, say. Admittedly, they will not usually describe what they have signed up to as “brainwashing”. As they see it, even while they are fully aware that the above techniques will be applied to them, they nevertheless suppose they are merely being “educated” – being put through a process that will open up their minds and allow them to see the truth.
Also notice that people are sometimes forcibly confronted with the truth. I might be forced to look at compelling evidence that someone I love has done some terrible deed, evidence that does convince me that they’re guilty. So not only is brainwashing not always non-consensual; not all non-consensual mind-change is brainwashing.
Reason vs. brainwashing
So what is brainwashing, then? What marks it out from other belief-shaping mechanisms? At this point, some readers might be wondering whether what I am calling “brainwashing” is really any different to any other educational method. Isn’t the application of reason to persuade really just another form of thought-control? Just another way of wielding power over the minds of others? So why shouldn’t we favour brainwashing over reason? Particularly if no one is actually being coerced, threatened or harmed?
In fact, there’s at least one very obvious and important difference between the use of reason and the use of these kinds of belief-shaping techniques. Reason is truth-sensitive. It favours true beliefs over false beliefs. Try making a rational case for believing that New Jersey is populated with ant-people or that the Earth’s core is made of yoghurt. Because these beliefs are false, you’re not going to find it easy.
Reason functions, in effect, as a filter on false beliefs. It’s not one hundred percent reliable of course – false beliefs can still get through. But it does tend to weed out false beliefs. There are innumerable beliefs out there that might end up lodging in your head, from the belief that Paris is the capital of France to the belief that the Earth is ruled by alien lizard-people. Apply your filter of reason, and only those with a fair chance of being true will get through. Turn your filter off, and your head will soon fill up with nonsense.
And yet many belief systems do demand that we turn our filters off, at least when it comes to their own particular beliefs. In fact, those who turn their filters off – those whose minds have become entirely passive receptacles of the faith – are often held up by such belief systems as shining examples to others. Mindless, uncritical acceptance (or, as they would see it, a simple, trusting faith in the pronouncements of Big Brother) is paraded as a badge of honour.
Reason is a double-edged sword. It does not favour the beliefs of the “educator” over those of the “pupil”. It favours those beliefs that are true. This means that if you use reason to try to bring others round to your way of thinking, you run the risk that they may be able to demonstrate that it is actually you that’s mistaken. That’s a risk that some “educators” aren’t prepared to take.
The contrast between the use of reason to persuade and the use of the kind of belief-shaping mechanisms outlined above is obvious. You can use emotional manipulation, peer pressure, censorship and so on to induce beliefs that happen to be true. But they can be just as effectively used to induce the beliefs that Big Brother loves you, that there are fairies at the bottom of the garden and that the Earth’s core is made of yoghurt. Such techniques do indeed favour the beliefs of the “educator” over those of the “pupil”. Which is precisely why those “educators” who suspect they may end up losing the argument tend to favour them.
I call the application of such non-truth-sensitive belief-inducing techniques – techniques that don’t require even the pretence of rational persuasion – Pressing Your Buttons. Brainwashing involves the systematic and dedicated application of such button-pressing techniques.
Of course, to some extent, we can’t avoid pressing the buttons of others. Nor can we entirely avoid having our own buttons pressed. The fact is, we all have our beliefs shaped by such non-truth-sensitive mechanisms. No doubt we flatter ourselves about just how “rational” we really are. And, like it or not, you will inevitably influence the beliefs of others by non-truth-sensitive means.
For example, my own children’s beliefs are undoubtedly shaped by the kind of peer group to which I introduce them, by their desire to please (or perhaps annoy) me, by the range of different beliefs to which I have given them access at home, and so on. But of course that’s not yet to say I’m guilty of brainwashing my children. The extent to which we shape the beliefs of others by pressing their buttons, rather than relying on rational means, is a matter of degree. There’s a sliding scale of reliance on non-truth-sensitive mechanisms, with brainwashing located at the far end of the scale. There’s clearly a world of difference between, on the one hand, the parent who tries to give their child access to a wide range of religious and political points of view, encourages their child to think, question, and value reason, and allows their child to befriend children with different beliefs and, on the other hand, the parent who deliberately isolates their child, ensures their child has access only to ideas of which the parent approves, demands formal recitation of certain beliefs, allows their child to befriend only children who share the same beliefs, and so on.
The dehumanizing effect of button-pressing
So one key difference between relying on reason to influence the beliefs of others and relying on button-pressing is that only the former is sensitive to truth. Button-pressing can as easily be used to induce false or even downright ridiculous beliefs as it can true beliefs.
There is also a second important difference worth noting. As the philosopher Kant noted, when you rely on reason to try to influence the beliefs of others, you respect their freedom to make (or fail to make) a rational decision. When you resort to pressing their buttons, on the other hand, you are, in effect, stripping them of that freedom. Your subject might think they’ve made a free and rational decision, but the truth is they’re your puppet – you’re pulling their strings. By resorting to button-pressing – peer pressure, emotional manipulation, repetition, and so on – you are, in effect, treating them as just one more bit of the causally manipulable natural order – as mere things. The button-pressing approach is, in essence, a dehumanizing approach.
Conclusion
Clearly, a cult that employs full-blown brainwashing at a training camp is a cause for concern. If the beliefs it induces are pernicious – if, for example, followers are being lured into terrorism – then obviously we should be alarmed. However, even if the beliefs induced happen to be benign, there’s still cause for concern.
One reason we should be concerned is the potential hazard such mindless and uncritical followers pose. They may as well have cotton wool in their ears so far as the ideas and arguments of non-believers are concerned. They are immune to reason. Trapped inside an Intellectual Black Hole, they are now largely at the mercy of those who control the ideas at its core. The dangers are obvious.
Such extreme examples of brainwashing are comparatively rare. Still, even short of full-blown brainwashing, if the promoters of a belief system come increasingly to rely on button-pressing to shape the beliefs of others, that too is a cause for concern. The more we rely on button-pressing, the less sensitive to reason and truth our beliefs become.
[i] Solomon Asch’s conformity experiments revealed that people are prone to denying the evidence of their own eyes if it brings them into disagreement with others (though admittedly this is not quite the same thing as changing what one believes in order to conform). See Asch, S. E., “Effects of Group Pressure upon the Modification and Distortion of Judgment”, in H. Guetzkow (ed.), Groups, Leadership and Men (Pittsburgh, PA: Carnegie Press, 1951).
[ii] The Times, 20th July 2005, p. 25.
[iii] George Orwell, Nineteen Eighty-Four (Harmondsworth: Penguin, 1954), p. 265.
[iv] Kathleen Taylor, “Thought Crime”, The Guardian, 8th October 2005, p. 23.