- We come programmed with certain mental mechanisms, also known as cognitive biases, that have helped us survive, and still do.
- One of these mechanisms is called illusory correlation: forming a false connection between two things even when no correlation exists.
- Illusory correlation is the tip of the iceberg. Looking deeper, we begin to see how ‘knowledge gaps’ lead to the illusory correlation effect and how, in turn, this leads to confirmation bias.
- This amalgam of cognitive errors is why we believe certain diets or training programmes hold magical properties, become die-hards for certain diets, foods, or training protocols, and cling to [faulty] beliefs even in the face of disconfirming evidence.
In 2002, Zambia faced one of its worst droughts in decades, forcing the government to declare a food crisis. On the verge of starvation, Zambians had resorted to boiling poisonous wild roots and killing protected elephants for food.
Within weeks of the crisis being declared, the United States had sent thirty-five thousand tons of food in aid: enough to see the country through the drought and up until the next harvest.
But what happened next is probably one of the most surprising [and shocking] turns of events I’ve ever heard of: the Zambian president, Levy Mwanawasa, rejected the aid.
Of course, there were the typical speculations: the president’s a tyrant, the food wasn’t edible, and so on. None of which turned out to be true.
The real cause of this quandary? The two words plastered on the side of the food crates: ‘genetically modified’.
The Zambian president declared:
‘We would rather starve than get genetically modified foods.’
To the Zambians, this wasn’t food, but ‘poison’.
If you’re reading this and thinking what the shit? You’re right – what the shit, indeed. But while the president’s reaction may seem absurd or irrational, the mental error that caused him to reject aid for a starving country is the exact same mental error that causes us to believe things that aren’t necessarily true.
Let’s get into that.
What do Carbs, Diet Cults and Broscience have in common with the Zambian food crisis?
Ok, I had intended to answer that with a hilariously witty piece of prose. But I couldn’t think of anything. Sooo, moving on.
All of the aforementioned are tightly interlinked by the enigmatic red thread of Illusory Correlation.
Illusory correlation is a mental error in which we form a relationship between two separate variables [people, events, behaviours, etc.] even when no relationship exists.
This false association is formed because our brains need to make correlations between two things as fast as possible for our survival: Hmm, that branch on the ground looks an awful lot like a snake. I wonder if it is a snake?… Or maybe it….*bite*. Yeah, that was definitely a snake. My bad.
Unlike our naive adventurer above, you’d jump back first, and only realise it was a branch after the fact.
See, our mind hates uncertainty and randomness; it needs to make connections even when there are none, because it has a need for order and control. Everything must have a meaning (even if there is none), and if it can’t find a meaningful pattern, you best believe it will go out of its way to create one.
Let me show you. Look at this image:
Yeah, the image above, consisting of a group of black shapes inside a larger white square… oh, what was that? You see a face?
This is your brain finding meaning in a seemingly abstract image.
You know that the image is just a collection of shapes, but no matter how hard you try to explain this to yourself, you can’t help but see a face.
Undoubtedly, our brain is trying to help us out with this mechanism. The problems begin when it seeps out from its role of keeping us safe and into our thinking in every facet of life.
So, what causes illusory correlation to occur? This mental error is actually a melange of different cognitive errors that come together to form faulty correlations.
And that’s what we’re going to be delving into in this article.
Data Points & Mental Models
From the time you’re able to understand the world, to this exact moment sat reading this, and beyond, you’re collecting data: thoughts, ideas, opinions, etc. These data points shape who you are and how you see [and interact with] the world around you.
These data points accumulate into metaphorical data files stored in your brain called ‘mental models’.
These mental models dictate how you see the world, solve problems and think.
But, there’s a small catch that comes with the creation of these mental models.
Knowledge Gaps & Confirmation Bias
Mental models are only the beginning of the rabbit hole that is illusory correlation.
If mental models are built on the information you’re exposed to, then the more you expose yourself to one line of information, the stronger the mental model for that belief becomes.
And the more you see of this information, the deeper the belief cements itself into your consciousness.
The deeper this belief cements itself, the more you begin to dismiss other bits of information that oppose, or don’t fit with, what you believe. And it’s this asymmetry of input – what you take in versus what you choose to ignore – that creates ‘gaps’ in your knowledge.
Faced with these knowledge gaps, our mind, in a desperate attempt to make sense of things, is forced to make correlations between the limited information [mental models] it has access to.
Thus, you think two things are correlated, like so:
X* = Z [the result]
*X being a diet, supplement or training programme, or an action.
Whereas in actuality, it was more like this:
X + Y* = Z
*Y being a critical part of the equation you missed due to knowledge gaps.
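A toy simulation can make this concrete. In this hypothetical sketch (all numbers are invented), the result Z depends only on a hidden variable Y [a calorie deficit], yet because following diet X tends to coincide with Y, X ends up looking ‘correlated’ with results:

```python
import random

random.seed(0)

# Hypothetical sketch: Z (fat loss) is driven entirely by Y (a calorie
# deficit), but the dieter only ever tracks X (following 'The X Diet').
n = 1000
results = []
for _ in range(n):
    x = random.random() < 0.5                  # did they follow The X Diet?
    # Following the diet *happens* to create a deficit more often than not,
    # but a deficit can exist without the diet too.
    y = random.random() < (0.8 if x else 0.4)  # the hidden Y: a calorie deficit
    z = y                                      # the result depends only on Y
    results.append((x, z))

# What the dieter sees: dieters get results far more often than non-dieters,
# so X "must" be the cause -- even though Y did all the work.
p_z_given_x = sum(z for x, z in results if x) / sum(1 for x, z in results if x)
p_z_given_not_x = sum(z for x, z in results if not x) / sum(1 for x, z in results if not x)
print(p_z_given_x, p_z_given_not_x)
```

Without tracking Y, the gap between the two rates is indistinguishable from ‘the diet works’; that gap is exactly the knowledge gap the equation above describes.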
Once these faulty correlations are made, the mind becomes ‘hyper-aware’ of the information you’re allowing it to be exposed to. With enough exposure to this one idea or belief, you slowly edge deeper into the inimical terrain of confirmation bias: now everywhere you look, you’re going to see ‘proof’ that what you’re doing is right, encouraging you to carry on even if it’s wrong.
Illusory Correlation In The Fitness Space
The fitness space is a playground for illusory correlation.
And, to show you how – I’m gonna hypothetically thought experiment the shit out of this.
Tony has been eating the prototypical Western diet: high carb, high fat, very low protein. Working an office job, and one too many late-night meals, have resulted in Tony becoming a bit… ermmm… cuddly over the years.
Tony realises this and in an attempt to reduce his cuddly points, he decides he’s going to start a diet to shift said cuddliness.
Fortuitously, as this thought enters our cuddly protagonist’s mind, he sees an ad for a new diet that’s been taking the country by storm: ‘The X Diet’.
‘That’s what I’ll do’ thinks Tony.
So Tony begins. And just like any diet, it comes with a set of guidelines: Tony can only eat certain foods while removing other foods and doing the accompanying exercise programme.
As a result, Tony has gone from consuming a typical western diet to a diet that’s filled with veggies, fruits, lots of protein and ‘heart healthy whole grains’.
In just a matter of weeks, Tony has begun dropping cuddliness and is beginning to resemble the guys on his favourite magazine covers.
‘This diet is fucking amazing!’ exclaims an exuberant Tony.
Soon Tony starts hanging around with people at work who are also following the diet, and on finding out there’s an official Facebook group for it, he joins that too.
Tony’s now immersed in information that reinforces his belief that this diet is magic: everywhere he looks, he sees people sharing success stories, transformation photos, ‘research studies’ and articles on the efficacy of the beloved diet.
Then, one day, as Tony is working out at the office gym, he notices the new guy, also working out. The guy is in incredible shape, so Tony decides to spark up a conversation:
‘Sup new dude, I see you’re in awesome shape: you must be following The X Diet also’
‘Oh, no, no. I follow the Y diet’ replies the new guy
Tony feigns a smile, nodding slowly. ‘…Oh.’ Tony’s perplexed: this new guy has presented information that contradicts everything Tony believes.
The fact that another approach could possibly work is unthinkable.
So, Tony politely thanks the new guy for his time, and goes back to finish his set of bicep curls.
‘Ha. Sure. ‘Y diet’…pffttt… my ass’ grumbles Tony, angrily, as he walks away.
While I did take a bit of creative liberty with the story, it isn’t far from what actually happens:
We’re exposed to an idea -> we try the idea -> we get awesome results -> if there are gaps in our knowledge we’ll draw faulty correlations -> thus starting the downward spiral into confirmation bias.
The longer we stay here, and the more we’re exposed to the same information, the stronger the belief becomes*; soon, this one diet or protocol is elevated to nutritional divinity.
*Fun fact: this effect is actually used in marketing and advertising [classical conditioning]. Ever seen an advert for a certain food or product, then on your next grocery run seen it on the shelf and ended up buying it without being able to explain why? Mhmm. Exactly.
As I touched on earlier, our mind’s an explanation-seeking machine, and it takes too long to sit down and contemplate the whys and the whats, the hows and the whos. It’s far easier for our mind to draw a correlation between the most obvious factors – like the protocol used and the results produced – than to also consider all the other variables that could have played a role, like:
— Does this new diet restrict certain foods, thus – are you maybe, possibly, kind of eating less food now?
— Is the diet giving you a set of rules to adhere to, like say, foods you can and can’t eat, thus less processed crap in your diet; resulting in improved health?
et cetera, et cetera.
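To put rough, entirely made-up numbers on the first question: simply banning a couple of food categories can quietly create a sizeable deficit, no magic required:

```python
# Hypothetical, invented numbers: what cutting a diet's 'banned' foods
# can do to daily intake, before any of the diet's other rules kick in.
before = {"meals": 2000, "snacks": 400, "soda": 300, "late_night": 350}

# Suppose 'The X Diet' bans soda and late-night snacking; nothing else changes.
after = {k: v for k, v in before.items() if k not in ("soda", "late_night")}

daily_deficit = sum(before.values()) - sum(after.values())  # kcal/day removed
weekly_deficit = daily_deficit * 7                          # kcal/week removed
print(daily_deficit, weekly_deficit)  # 650 4550
```

The dieter credits X [the diet’s ‘magic’], but the hidden Y here is just eating less.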
But once these correlations are made, it’s much ‘safer’ to stick to the known, than go out into the unknown that is novelty.
The result? People become die-hards for certain diets or foods, ideologies or principles – detoxes, fasting, cleanses, etc. – and anyone or anything that disagrees with or contradicts the held belief is exiled to the annals of their ignorance.
So, Uh, What Do We Do?
Cognitive biases, like illusory correlation and confirmation bias, have evolved over thousands of years, and we can’t expect to up and change thousands of years of programming in a moment of capriciousness. Biases, like most things, are context dependent: they can be bad or good, depending on the situation. But being aware of them lets us stay on the right side of context, avoid erroneous correlations, and correct course if and when we make them.
What I’ve suggested below are some of the things that have helped me.
1. Expose Yourself To New Ideas
We’ve spoken about mental models and data points, and that the more of one thing [idea, thought, belief] you feed your mind, the stronger that single belief grows and the more gaps in your knowledge you have.
To combat this: expose yourself to more diverse ideas.
‘Nobody is above cognitive influence — you need to keep your input diverse otherwise your mental models will be a regurgitation or influence of the small minority you’re exposing yourself to. Independence of thought comes by exposing yourself to different people/ideas constantly.’
The more you expose yourself to new ideas, the more ‘gaps’ in your knowledge you’re able to fill, and the larger your palette for thinking grows; minimising your chances of drawing faulty correlations.
The key is to keep yourself at the intersection of different ideas rather than picking an ‘idea corner’ and setting up shop.
Being at this intersection allows you to employ Opposable Thinking: the ability to critically think about something from different perspectives. Instead of seeing things through an absolute mindset, you free yourself to be open to other possibilities. 
‘The Scale of Wrong’
‘Wrong’ or ‘right’ ideas are context [and perception] dependent; what may be wrong for one person or situation, may well be right for another person or situation. While this holds true for the most part, even ‘context’ has limits. This is where, what I call, ‘The Sliding Scale of Wrongness’ comes in.
The Sliding Scale of Wrongness represents the three different types of wrong:
- ‘Meh’ – beliefs that may be wrong but don’t really matter that much, or make that much of a difference [fasted cardio, post workout shakes, ‘bro-splits’]
- ‘Hmm’ – this is the ‘grey area’ [or in scientific parlance n=1] where experimentation and anecdote come in. It may or may not be proven [yet] but there’s some evidence that it could work. This area has the potential to either provide results or not provide results without causing harm [to give an example of this, during my keto experiment I realised that it was a lot easier to drop fat from my more ‘stubborn’ areas with a less aggressive deficit]
- ‘Dangerous’ – the beliefs that make a massive difference and could potentially be dangerous or harmful* [this is where the Foodbabes, and guys who name themselves after certain types of fruits like to hang]
*when I say harmful I’m referring to both health and making progress with your goals.
It should go without saying that we want to stay in the ‘Meh’ and ‘Hmm’ parts of the scale, and as far away from the ‘Dangerous’ end as possible. Even anecdotes or conjecture need to stem from some grounding in evidence, not be plucked from the realms of pure fiction – this isn’t Harry Potter, bro.
This is why it’s important to be open to new ideas and leave old [faulty] ones behind accordingly. The more you stick to one idea or belief, the more likely you are to tread further towards the ‘Dangerous’ end of the scale as you try to apply that one mental model to everything. Or, as the saying goes, ‘to a man with a hammer, everything looks like a nail’.
Some things to consider:
- Read / listen / watch things outside the scope of your direct circle of interests
- Be open to hearing other people’s POV if they don’t fit your own [you’ll be pleasantly surprised by how many people actually agree with what you have to say]
2. Adopt An Empiricist Mindset
Being an empiricist means taking into consideration what the evidence says, but at the same time not being afraid to get in the trenches and do some experimentation.
However, with experimentation comes some guidelines. Here’s how to experiment and live to tell the tale:
Let’s assume you hear about a new diet or training programme that is apparently getting people amazing results, and you want to give it a go –
Firstly: is the protocol safe?
Yes, yes, I know that ‘safe’ will vary from person to person, but some things to be wary of:
- Crash diets [uber-aggressive calorie deficits: more than 1000 calories below your maintenance a day].
- Extremely low calorie diets for an extended period of time.
- Needlessly restricting certain foods [note that I said needlessly, meaning you’re restricting foods because you think you have to, as opposed to personally choosing to]
*as always, consult your doctor before starting a new diet or training programme
Secondly: Have you done your own research?
Speak to people who have tried the approach, get expert advice, make sure you know as much as possible before diving in.
Once you’ve found out it’s safe and gathered as much information as you can, try it for yourself and collect data [see point 4 below].
When I experimented with ketogenic dieting, this was the exact three-step process I went through before starting. 
3. The ‘Inversion Thinking’ Model
During the 5th century BC, Diagoras – a poet and philosopher – was shown painted tablets bearing the portraits of worshippers.
He was told that the worshippers had prayed, and thus, due to praying, had survived a subsequent shipwreck.
The implication was that praying protects you from drowning.
Diagoras listened, and then replied: “Where are the pictures of those who prayed, then drowned?”
This story is used to illustrate the Inversion Thinking Model: when someone presents you with an idea like ‘Well, I did X [X being a diet, removing certain foods, using certain training programmes or protocols, etc.] and got amazing results’, also look for all the people or instances where X was done and didn’t produce results.
This goes hand in hand with being an empiricist. To quote Nassim Nicholas Taleb:
‘Empiricism is not about not having theories, beliefs and causes and effects: it’s about avoiding being a sucker. An empiricist facing a series of facts or data defaults to suspension of belief while others default to a characterisation or theory. The entire idea is to avoid the confirmation bias and prefer to err on the side of the disconfirmation/falsification bias’
For years I’d held the belief that a ketogenic diet could in no way be conducive to body composition. I had exposed myself to a single idea – carbs being a necessity – and had unwittingly fallen victim to confirmation bias. It wasn’t until I used this model of thinking, and saw that there were people getting results on a ketogenic diet, that I was able to crawl out of this bias and experiment with the diet myself – which changed my perspective on many things [empiricism].
This is the basis of The Inversion Thinking Model: question everything, and for every belief you have, look for instances where the opposite may also hold true.
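Diagoras’s question can even be put into numbers. In this hypothetical sketch (all counts are invented), looking only at the successes on X is misleading; comparing the success rates with and without X is the inversion:

```python
# Hypothetical counts: don't just count the glowing testimonials for X,
# compare success *rates* with and without X. All numbers are made up.
did_x_success, did_x_fail = 40, 60  # people who did X
no_x_success, no_x_fail = 35, 65    # people who didn't

rate_with_x = did_x_success / (did_x_success + did_x_fail)
rate_without_x = no_x_success / (no_x_success + no_x_fail)

# 40 success stories look impressive in isolation, but the rates are
# nearly identical: X adds little once you count the 'drowned' too.
print(rate_with_x, rate_without_x)  # 0.4 0.35
```

The testimonials are the painted tablets; the failure counts are the drowned sailors nobody painted.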
4. Data & Reflection
We learn best by reflecting on an experience, not simply by going through it. Reflection allows us to look back on why we did things and how we can improve or change them in the future.
Of course, to be able to reflect, we need something to reflect on. This is where data comes in.
Things to consider:
- Journal – tracking thoughts, feelings, emotions, etc
- Progress photos
- Measurements [body measurements, body fat%, strength and performance etc]
- Notes – what you did, methods used, what happened?
The more tangible data you have, the more you can reflect, learn and improve.
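As a minimal sketch of what ‘collect data, then reflect’ can look like (the weights below are made up): daily scale readings are noisy, so weekly averages give you something steadier to reflect on:

```python
# Made-up daily scale weights (kg) over two weeks. Day-to-day readings
# bounce around with water and food; weekly averages show the trend.
daily_weights = [82.4, 82.9, 82.1, 82.6, 82.3, 81.9, 82.2,   # week 1
                 81.8, 82.3, 81.6, 82.0, 81.7, 81.4, 81.6]   # week 2

week1 = sum(daily_weights[:7]) / 7
week2 = sum(daily_weights[7:]) / 7
change = week2 - week1  # negative means the average is trending down

print(f"week 1: {week1:.2f} kg, week 2: {week2:.2f} kg, change: {change:+.2f} kg")
```

Reflecting on the averages [rather than panicking over a single 82.3 after an 81.6] is exactly the kind of faulty correlation this article is about.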
5. Learning by Unlearning
This is the hardest, yet most important part, thus I saved it for last.
Dogma gets a bad rap.
The things we’ve grown up with – environment, culture, friends/family etc. – have shaped us into who we are. We need some amounts of dogma: the things that make us ‘us’ and keep us connected to our roots. It’s our identity, and we gotta own that. [what I dub local dogma]
It’s when local dogma becomes global dogma: shutting yourself off to new ideas because you think they, somehow, are an attack on your individuality, that it becomes a problem.
It’s not binary: you can be true to yourself and still be open to new ideas and exploration.
In order to do that, you have to be willing to unlearn what you’ve learnt. Josh Waitzkin describes the importance of this in his fantastic book The Art of Learning:
‘Over time each chess principle loses rigidity. Soon enough, learning becomes unlearning. The stronger chess player is often the one who is less attached to a dogmatic interpretation of the principles. This leads to a whole new layer of principles — those that consist of the exceptions to the initial principles.’
The more ideas you have access to, the less chance of falling victim to mental errors like illusory correlation and the more you open yourself up to learning even more.
“Even if facts themselves lead to understanding, you can’t have understanding without facts. And crucially, the more you know, the easier it is to know more. Memory is like a spider web that catches new information. The more it catches, the bigger it grows. And the bigger it grows, the more it catches”
The Wrap Up
I know I’ve spent the entirety of this article arguing for the need to be open minded and not sticking to faulty beliefs, but I’m also fully aware of how easy it is to say, and how hard it is to do.
Changing what we believe, or simply being open to another idea, is hard, mainly due to belief perseverance: the tendency to cling to what we believe even in the face of disconfirming evidence. With that said, if you’ve read up ’til this point, I’m going to assume you’re at least open to the idea.
And, look. I’m not naive enough to think that just because you read this you’re automatically going to drop all your current beliefs and run off to experiment with new things, and expose yourself to new ideas [I only hope that you do].
But, here’s the thing: the onus is on all of us to keep our ideas updated by doing our own research; questioning things; continually learning and expanding our knowledge, and being willing to disregard old ideas if and when they turn out to be wrong [or harmful].
Otherwise, just like the Zambian President, we’ll continue drawing false correlations between things when none may exist; strengthening faulty or outdated mental models; and keeping the gaps in our knowledge forever growing.
And hey, you might not have a starving country to deal with, but this fitness stuff is important too, right?
‘Stop the process of identifying with what you already believe. Who the hell cares what you believe, you should be attending to the things you don’t know. Which is a much larger space than the things you know’.
- Originally came across this story in ‘The Rational Animal’
- The ‘opposable thinking’ concept is my own take on the idea of ‘integrative thinking’ from The Opposable Mind
- If you’re looking for medical advice on the net, I highly recommend Dr. Spencer Nadolsky. Not only is he a doc [the real kind], he also lifts. So he better understands some of the nuances your doctor may not [with all due respect to your doctor]
- Luis Villasenor aka DarthLuiggi of Ketogains Reddit fame was my go to guy during the Keto experiment. He’s been in ketosis for over 15 years and was a wealth of knowledge for me to draw on. He’s also just a genuinely awesome guy and the Ketogains community is probably one of the most welcoming groups I’ve ever seen on Facebook. If keto is something you’re interested in checking out, be sure to check out the group and hit up Luis.
- I use the example of ketogenic dieting to illustrate the time I fell victim to these biases. Just because I found keto effective doesn’t mean you will.
- An honourable mention and thank you to my friend Ryan Nayr for telling me about the awesome blog Farnam Street, whose writing deals heavily with mental models.
If you enjoyed this, you'll love my emails.
I write articles like this one just for my email subscribers. These can be quick and random thoughts, or really in-depth pieces, none of which are posted anywhere else. And I want you to receive them, too. Just click the button below.