Hunting the unexplained: the importance of “investigating” strange phenomena

The other night I became embroiled in a conversation sparked by a fellow skeptic’s comment: “hunting” big cats, monsters and ghosts, and investigating their sightings, is utterly pointless; the psychology behind such beliefs isn’t.

It’s an opinion I can understand and one I was, to some extent, sympathetic to a few months ago. However, I changed my mind, so here’s why I think ‘investigating’ these phenomena* is worthwhile.

*Please note I am using the scientific definition of phenomena – i.e. ‘any observable occurrence’.

The shortest reason I disagree is this: if there’s no investigation, it’s very hard to know what caused something. How are we to learn about the phenomena that cause seemingly odd things to happen – and therefore the reason for the belief, the psychology behind it? Was it illusion, delusion or hoax?

Any investigation is difficult if you don’t know the circumstances.

What does paranormal really mean?

According to Tobacyk (1988), paranormal phenomena are those which, if genuine, would violate basic limiting principles of science.

The view that some skeptics take is that “given we know that ghosts are impossible, there’s little point in investigating what will either turn out to be a hoax, an illusion, a delusion or just a simple misinterpretation.”

“‘Investigating’ these phenomena gives credibility to the idea that the paranormal phenomenon is a real one, and is a waste of time, effort and money.”

I believe that investigating the paranormal does not always legitimise the claim; it often shows how a reasoned explanation makes more sense than a supernatural one.

Dismissing engagement with believers in the paranormal seems to me on a par with dismissing engagement with the religious or with believers in alternative medicine – especially if the reason is that you don’t think it’s worth it, that it’s pointless because they’re stupid or won’t listen.

I firmly believe that encouraging logical and rational thinking on any subject is worth it. Arguably, it’s easier to convince someone that what they thought was a ghost actually had a normal, mundane and perfectly reasonable cause – that their flickering lights can be explained by faulty electrical wiring, or that a spooky sound was caused by the house cooling on a cold winter night – than to tell them God isn’t real.

It’s harder still to show someone that the benefit they think they feel is actually explained by the placebo effect rather than a true effect of the homeopathy they tried.

You can sometimes actually reproduce the effects of the ‘ghost’ by demonstrating how it ‘materialised’, or test that piece of fur to find out whether it came from a domestic pet or from Bigfoot. And I think that’s worth it.

Tumbling the blocks of irrationality: an edifice of false belief

[Image: Jenga blocks toppling – toppling the edifice of false belief]

If you knock out one false belief, sometimes the others come tumbling down too, dislodged along with it. I see this as a game of Jenga – pull out the right block and you topple the individual bricks that comprise the ‘edifice of false belief’. Teaching critical thinking is a gift: not only will you show someone they need not be frightened of the horrible scream (that was a barn owl or a fox**) and potentially put their mind at rest, but you’ll give them a way to examine, discover and analyse the world.

Critical thinking is a tool not only for examining paranormal claims, but for life in general. A tendency to gullibility can lead to all kinds of trouble, from being scammed out of your hard-earned cash by a doorstepping confidence trickster to giving money to someone claiming to have found your long-lost inheritance. Or just checking whether the word gullible really is in the dictionary or not.

** The first time I heard a vixen calling to her mate, I was convinced someone had left a screaming baby on my doorstep; the second time I heard it that night, I thought the fictional mother of the fictional baby was being attacked. Terrifying stuff. Now when I hear a fox it’s still creepy, but I’m no longer afraid.

The trouble is, humans are natural pattern seekers. Our tendency to see faces and notice patterns is an evolutionary benefit, but it also leads to what are called situational or external ‘attribution errors’: we attribute something that has caught our attention or frightened us to ‘the paranormal’ or the supernatural. This is also sometimes called the god-of-the-gaps argument.

Belief in the paranormal: is it ubiquitous?

Not as ubiquitous as an ‘availability error’✥ might suggest to a skeptic, but there is definitely a significant number of people with a belief in the paranormal.

✥ determining probability “by the first thing that comes to mind” (Sutherland 1992: p. 11)

In 2001, Margaret Hamilton found that only 25% of her participants confirmed they had absolutely ‘no belief’ in astrology, despite general skepticism in her sample. The Gallup organisation in the US polled citizens on their beliefs and found that 73% of respondents believed in at least one of these ten paranormal items: extrasensory perception (41%), haunted houses (37%), ghosts (32%), telepathy (31%), clairvoyance (26%), astrology (25%), communication with the deceased (21%), witches (21%), reincarnation (20%) and channelling spirits (9%) (source: Wikipedia).

An online Australian survey in 2006 polled 2,000 people worldwide and found that 70% believed they had experienced a life-changing paranormal event, such as being touched by an entity they knew was not really there. Furthermore, 80% reported having had at least one premonition, and 50% believed not only that they had lived a previous life but that they could remember it. Subsequent studies in the US at Oklahoma University in 2006 have found consistent results (source: Wikipedia).

So these people must be stupid, right?

[Image: a boy puts his face on a palm tree – “face palm, you’re doing it wrong”]

“Wasting time demonstrating to a ghost believer why her electrical hum isn’t a ghost is pointless [in my opinion]”

“To get to the point of belief, they are already actively ignoring overwhelming evidence to the contrary. After that, it’s idealism”

– two comments addressing this subject in a recent discussion

What a piece of work is a man, how noble in reason, how infinite in faculties… (Shakespeare’s Hamlet, Act 2, scene 2, 303–312)

The fact is, as I mentioned earlier, we’re all subject to irrational beliefs and seeing things that aren’t there. Hamlet was wrong. We’re anything BUT noble in reason and infinite in faculties.

stuffthatlookslikejesus.com – click here for pareidoLOLs

Pareidolia is possibly one of the skeptics’ favourite phenomena. From seeing religious imagery like Jesus in a dog’s rear end (actually an edited image) to the Virgin Mary in a toasted cheese sandwich, or even a house that looks like Hitler, humans recognise patterns and faces automatically and instantly. Some pioneering people have even made software that can do it for us. Guess what? Even machines experience pareidolia.
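As a minimal sketch of the kind of face-spotting software I mean (the post doesn’t name a particular program, so this assumes OpenCV and its bundled Haar-cascade face detector, plus a hypothetical image file), here’s how a machine can be coaxed into ‘seeing’ faces in toast or wood grain – its false positives are its pareidolia:

```python
# A hedged sketch of 'machine pareidolia', assuming OpenCV's stock Haar-cascade
# face detector (not whatever specific software the post has in mind).
import cv2

def find_face_like_patterns(image_path):
    # Load the frontal-face cascade that ships with opencv-python.
    cascade = cv2.CascadeClassifier(
        cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
    )
    image = cv2.imread(image_path)
    if image is None:
        raise FileNotFoundError(image_path)
    gray = cv2.cvtColor(image, cv2.COLOR_BGR2GRAY)
    # A low minNeighbors value means fewer overlapping detections are needed to
    # accept a 'face', so face-like noise (clouds, toast, knots in wood) gets
    # through -- the machine's equivalent of seeing Jesus in a sandwich.
    return cascade.detectMultiScale(gray, scaleFactor=1.05, minNeighbors=1)

if __name__ == "__main__":
    # 'toast.jpg' is a hypothetical example image, not one from the post.
    boxes = find_face_like_patterns("toast.jpg")
    print(f"The machine 'saw' {len(boxes)} face-like pattern(s).")
```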

Irrational or ignorant? Heuristics, judgement and biases

“[ignorance] can only be assessed in the light of what a person knows” 

“What constitutes a rational decision depends upon one’s knowledge. If one has reason to believe one’s knowledge is insufficient, then it is rational… to seek out more evidence [but when they do] they usually [act] in a wholly irrational way [seeking] evidence that will support their existing beliefs” (Sutherland 1992: p. 3)

But if you’d already made what might seem like a rational decision – that what happened was unexplained (or even inexplicable) – then you have no reason to believe your knowledge is insufficient. As humans, we are notoriously bad at weighing probability and at seeking out the truth rather than relying on confirmation bias. Everyone is (or has been) guilty of this; confirmation bias is insidious that way, and it’s not the only error that self-proclaimed rational people make.

Three errors I’ll bet you’ve made at some point

Halo effect – first impressions really do count: our judgements of a person’s abilities can be influenced by our overall impression of them. That’s why skeptics hang our heads when we see a celebrity endorsing a product, service or medical treatment that’s known to be bunkum. It’s a basic appeal to authority, and that appeal rests squarely on the halo effect. It’s also directly related to the Fundamental Attribution Error (more on attribution theory).

The ‘sunk cost error’ – have you ever got halfway through a book you thought was awful, but kept reading because you’d already got that far, so you might as well continue? Bought tickets to something in anticipation of the event, then later forced yourself to attend when you were too tired or ill to go? Or, much more seriously, heard a politician insist that we continue fighting a war so that the lives already lost were not in vain? That’s the sunk cost error – persisting “with a non-optimal course of action because [you] have already invested time or resources in it” (Macaskill & Hackenberg, 2011).

Being a real, human-faced skeptic, instead of a virtual avatar, makes you all the more inconvenient to people selling falsehoods

The point is that we must recognise we are all subject to wrong beliefs; we all commit fallacies, make judgement errors and fall into biases. I see little value in placing these into a hierarchy – providing rational explanations for observed phenomena we didn’t at first understand is all part of the same aim: rationality. I’ve been to enough holistic and mystic events as an observer to realise that the trouble is that many of these beliefs overlap and intersect with one another. Sowing a seed of doubt, or pulling out a block from that Jenga tower, by being a real, live skeptic at an event, instead of a virtual avatar, makes you all the more inconvenient to people selling falsehoods.

Some of us can name the illusions we experience, some of us at least know an illusion when we see one, but some of us don’t know at all, and an attribution error is made: the seemingly inexplicable phenomenon is put down to a supernatural or paranormal explanation.

We’re all subject to silly beliefs and irrationality – but suggesting you’re above human fallibility is nothing but arrogance

References:

Hamilton, M. (2001). Who believes in astrology? Effect of favorableness of astrologically derived personality descriptions on acceptance of astrology. Personality and Individual Differences, 31(6), 895–902.

Macaskill, A. C. & Hackenberg, T. D. (2011). Providing a reinforcement history that reduces the sunk cost effect. Behavioural Processes, 89(3), 212–218. (Online at http://www.ncbi.nlm.nih.gov/pubmed/22108673, retrieved 18/06/2012.)

Sutherland, S. (1992). Irrationality: The Enemy Within. London: Constable and Company.

Tobacyk, J. J. (1988). A Revised Paranormal Belief Scale. Unpublished manuscript.
