Over the next few posts I will expand on personal and interpersonal strategies for connecting to the world. In particular I’m going to focus on separating signal from noise: how to tune into sense-making and meaning-making.
Before exploring strategies, let’s look at the biological and neurological constraints we have to work with.
Sensory salience
Our eyes can distinguish several million colours, our ears half a million tones, and our noses over a trillion different odours. (Merlin Sheldrake, Entangled Life)
Given the enormous volume of sensory data we are able to detect, it is no surprise that we have developed strategies to “make sense” of all these inputs.
For example, our eyes detect motion in the periphery at higher temporal resolution than in central vision (so you know when that wolf is pouncing!), but at the expense of colour (because you really don’t care at that point). In the periphery, motion is salient and we detect it more efficiently.
In general our senses are searching for salient information in order to detect both threat and opportunity, with an emphasis on threat.
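That emphasis on threat can be made concrete with a toy simulation (my own illustration, not from any cited source): a detector that lowers its decision threshold for rare threats accepts more false alarms in exchange for fewer misses, and a miss is the costly error.

```python
import random

# Toy sketch of threat-biased detection: threats are rare and sensed
# noisily; a lower alarm threshold means fewer misses, more false alarms.
random.seed(0)

def trial(threshold):
    threat = random.random() < 0.1                            # threats are rare
    sensed = (1.0 if threat else 0.0) + random.gauss(0, 0.6)  # noisy perception
    return threat, sensed > threshold

def error_counts(threshold, n=10000):
    misses = false_alarms = 0
    for _ in range(n):
        threat, alarm = trial(threshold)
        misses += threat and not alarm
        false_alarms += alarm and not threat
    return misses, false_alarms

cautious = error_counts(threshold=0.2)  # jumpy: few misses, many false alarms
relaxed = error_counts(threshold=0.8)   # calm: more misses, fewer false alarms
print("cautious (misses, false alarms):", cautious)
print("relaxed  (misses, false alarms):", relaxed)
```

The numbers here are arbitrary; the point is the shape of the trade-off — evolution plausibly tuned us toward the “cautious” setting, because a false alarm costs a sprint while a miss costs a life.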
Preconception and pattern recognition
To be more cognitively efficient we ignore a lot of sensory input (the less salient), and use preconceived images based on expectation, which we update with occasional new sensory input. This is how “sleight of hand” magic tricks work: our preconceptions create literal blind spots.
In turn, our expectations are informed by pattern recognition, the process of mapping input to semantic memory. Semantic memory is built up over time and includes facts, ideas, meanings and concepts that are enmeshed with experience and are culturally dependent. In other words, our expectations carry experiential and cultural bias; the cost of efficiency is confusing signal with noise.
Cognitive biases
Furthermore, what we determine to be salient in the first place is informed, in part, by our cognitive biases. Our biases become heuristic lenses used to assign salience to input that resonates with our preconceived expectations.
This dynamic feedback loop can increasingly lead us to confuse signal with noise. Simple examples of this are confirmation bias where we search for signals that confirm what we already believe, and salience bias, the tendency to focus on items that are more emotionally striking.
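As a hypothetical illustration of that feedback loop (the model and numbers are my own assumptions, not from the research cited here), consider an agent that preferentially notices evidence agreeing with its current belief. Even when the world is perfectly balanced, its belief drifts to an extreme:

```python
import random

# Confirmation-bias feedback loop, sketched: the world emits evidence that
# is true half the time, but the agent is more likely to notice evidence
# that matches its current belief -- so the belief self-reinforces.
random.seed(1)

belief = 0.5          # estimated probability that some claim is true
attention_bias = 0.8  # chance of noticing evidence that agrees with us

for _ in range(1000):
    evidence = random.random() < 0.5   # unbiased world: true half the time
    agrees = (evidence and belief > 0.5) or (not evidence and belief < 0.5)
    noticed = random.random() < (attention_bias if agrees else 1 - attention_bias)
    if noticed:
        # Running-average update over only the evidence we happened to notice.
        belief += 0.05 * ((1.0 if evidence else 0.0) - belief)

print(f"final belief: {belief:.2f}")  # drifts well away from the true 0.5
```

A balanced belief of 0.5 is an unstable point in this sketch: the first nudge in either direction makes agreeing evidence easier to notice, which pushes the belief further the same way.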
TL;DR Our biases and expectations lead us to assign importance to the wrong information in order to more efficiently “make sense” of the world.
To become better at detecting true signal we need to account for all of this, but even adjusting for our own cognitive biases is an almost impossible task. This is where we need each other: it is much easier to recognise someone else’s expectations and biases than our own.
Pattern recognition has served us very well so far, affording rapid detection of both resources and threats. But the increase in noise, particularly noise created by humans (from naïve bullshitting to PSYOPs), undermines the utility of pattern recognition. It is worth developing improved signal detection to build more accurate expectations and a better fit to reality.
Heuristics
A heuristic is an algorithm in a clown suit. It’s less predictable, it’s more fun, and it comes without a 30-day, money-back guarantee. - Steve McConnell
Heuristics are mental shortcuts to problem solving and self-discovery. We use them when looking for an immediate probability judgment or when we wish to lighten our cognitive load. We also employ heuristics when facing complex problems or incomplete information.
They are attractive as a decision-making tool because they are effective; unfortunately, they can also be irrational and inaccurate. The trade-off is between deciding immediately and deciding accurately, which can lead to systematic errors and the forming of cognitive biases.
There are numerous examples, but here are a few to give you a better taste of what I mean.
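McConnell’s clown suit can be tried on directly. Here is a minimal sketch (my own example, using the classic travelling-salesman problem) of a greedy heuristic against an exhaustive search: the heuristic is fast and usually good, but only the exhaustive search is guaranteed accurate.

```python
import itertools
import math

# A handful of 2D points to visit, starting from the first.
points = [(0, 0), (5, 1), (1, 4), (6, 5), (2, 2)]

def dist(a, b):
    return math.hypot(a[0] - b[0], a[1] - b[1])

def tour_length(order):
    return sum(dist(order[i], order[i + 1]) for i in range(len(order) - 1))

def greedy(start, rest):
    # Heuristic: always jump to the closest unvisited point. Fast, but
    # short-sighted -- it can paint itself into a corner.
    tour, remaining = [start], list(rest)
    while remaining:
        nxt = min(remaining, key=lambda p: dist(tour[-1], p))
        tour.append(nxt)
        remaining.remove(nxt)
    return tour

def exact(start, rest):
    # Exhaustive: try every ordering. Always accurate, factorial cost.
    return min(([start] + list(p) for p in itertools.permutations(rest)),
               key=tour_length)

g = tour_length(greedy(points[0], points[1:]))
e = tour_length(exact(points[0], points[1:]))
print(f"greedy: {g:.2f}  exact: {e:.2f}")  # greedy is never shorter than exact
```

With five points the exhaustive search is trivial; with fifty it is computationally hopeless, which is exactly why brains (and software) reach for the heuristic and accept the occasional wrong answer.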
The availability heuristic occurs because some memories are easier to recall than others; we then use the ease of recall as an indicator of importance.
When we use the representativeness heuristic, we make probability judgments about the similarity between an object or event, and a prototypical example of a specific category.
For example, a specific category may be “wolf”, we see an object in the woods that is similar to a prototypical wolf and decide to run. This might be a false signal, but is better than being eaten!
However, this useful function can lead us to stereotyping and to making inaccurate judgments based on similarity rather than actuality.
The simulation heuristic is when we determine the probability of an event, or of another’s behaviour, based on how easy it is to mentally picture. We are constantly simulating everything around us in order to predict the likelihood of events occurring.
Social proof, also known as informational social influence, is where people copy the actions of others. It is more prominent when people are uncertain how to behave, especially in ambiguous social situations. We see this in social media and in mob behaviour.
What are mental models?
A mental model is a tool for representing how we think something works. Mental models are attempts to simplify the world so we can organise and understand complexity, and as such they are used in many aspects of life to aid decision making.
They differ from heuristics in that they can be learned and consciously applied. However, the more ingrained a model is, the more it will unconsciously influence your perception and behaviour, so it is worth making your models conscious and questioning how well each one fits a given situation.
Signal detectors
Mental models are much more useful for separating signal from noise, as they are largely developed through observation and tested for their utility through application.
Many successful people such as Richard Feynman, Charlie Munger, and Naval Ravikant testify to the power of mental models to improve clear thinking and decision making.
Mental models can be found in mathematics, hard sciences (physics, chemistry, biology), economics, systems thinking, and even the faculty of human judgement.
Very short list of examples
There are far too many examples to list here; this is a huge topic worthy of proper study. So here are some common examples for context.
Mathematics — randomness and compounding
Physics and chemistry — inertia, leverage, alloying, and catalysts
Biology — ecosystems and niches
Economics — opportunity costs and utility
Systems thinking — feedback loops and algorithms
Human judgement — trust and denial
Circle of competence
The one mental model I will detail here is key to separating signal from noise.
We all have areas in life where we are competent. The way to test this is to be honest about what we know, where our limitations are, and to have a good idea about where we can learn more.
It takes time to become competent, and we cannot be complacent: things change, and we need to keep learning and practising to remain competent.
We also have egos, and egos have blind spots. This is where we can misinterpret noise as signal, either because we are not competent enough to evaluate it, or because the noise suits our biases.
Building a framework of mental models
There are several excellent books of mental models; the Farnam Street series “The Great Mental Models” by Rhiannon Beaubien and Shane Parrish is worth studying.
The value of creating a personal framework of mental models should not be underestimated. Having a reliable and consistent approach to thinking can help correct for personal biases and heuristics.
In addition, given the sheer volume of information we are exposed to on a daily basis, a reliable filtering system can help you choose what to ignore. Ignoring as much noise as possible is a key skill that helps us select where we need to pay attention in order to find signal.
As with all methods, working alone leaves us subject to self-deception; working with others increases our chances of success. More on this in the next post.