Think Smarter, Not Harder: Cognitive Load, Confirmation Bias and Mental Accounting

WINTER 2021/22

In the third instalment of our behavioural science guide, we highlight the role of cognitive load in the customer journey and discuss the theory behind confirmation bias and bad mental accounting. Read on to learn how managing system 2 correctly can improve your customer journey.

Last time we went through a brief crash course in Dual Process Theory, a pragmatic way of framing how we make decisions. People use a mixture of system 1 – fast, intuitive thinking – and system 2 – slow, evaluative thinking. If you read the last post, you should have an idea of what each of these systems is and may be getting ready to prompt system 1 responses within decision structures relevant to you. If you haven’t read it already, here’s the link.

Here we’ll explore the realm of system 2 errors, taking a more theoretical view of a few key types of bias. By the end of this post, you’ll have a deeper understanding of the interplay between systems 1 and 2, you’ll know the dangers of excessive cognitive load in the decision-making process and how to optimise it, and you’ll have got to grips with a few examples of bias and bad mental accounting.

System 2 errors

Cognitive Load Theory (CLT)

System 2 may err in many, many ways, but so that you have a good general understanding of this type of error, I’ll highlight only a few of the most common: excessive cognitive load, confirmation bias and poor mental accounting. Of these, potentially the most commonly exploited is our capacity for cognitive load – the amount of mental effort we have available to perform a task at any given moment. Rather than explain only the theoretical implications of cognitive load theory, we’ll also look at the practical ramifications.

So, what is cognitive load? Simply put, if we’re given too much to think about at once, system 2 thinking can quickly defer to heuristics to simplify the problem. This reduces cognitive load but at the same time exposes us to error. Deliberately overloading people to drive a simple response is unethical, and unlike the examples we discussed last time, influence under these circumstances often comes via dark designs which don’t have our best interests at heart.

We’ve all been there – think about the last time you agreed to a website’s cookie policy or signed a privacy policy for a new app or service. In general, the huge volume of information on these pages is enough to trigger an immediate deferral to a system 1 response. People are inclined to just click ‘next’ or ‘agree’ rather than wade through reams of information. This is an example of dark design, or dark patterning, whereby the structure of a choice is set up to push users in a direction they would not otherwise choose. We should all avoid this when designing influential customer journeys, because it’s not only a terrible customer experience but also morally reprehensible.

At ContactEngine, we try to bear cognitive load in mind when organising conversations within a customer journey. But the nuances of this in the context of choice architecture can be applied to many other modes of decision making. To start thinking about cognitive load at any given decision point, consider the intrinsic, extraneous, and germane loads that occur within the context of the decision. This deserves its own post, but some good practices to bear in mind are...

Structure Intrinsic Load

Intrinsic load is the amount of processing power required to understand the logical or instructive content of the task. At ContactEngine we make heavy use of Barbara Minto’s principles of structure to organise the ideas we want to convey – situation, complication, question/action. This helps manage intrinsic load by presenting information in a cogent, easy-to-understand way. When thinking about your own decisions, this can mean reducing a decision down to its essential and necessary components, which can aid precision and reduce bias.

To influence decision making in others, you can use similar logic. One way of thinking about this, and a recurring theme within the world of ‘nudging’, is to make it easy! Making a decision easier to take can solve a host of business and personal problems. Limit complexity wherever it is appropriate.

Minimise Extraneous Load

Extraneous load is the amount of superfluous mental processing required to understand the visual context of the task. In UX design, this means reducing clutter: unnecessary imagery, excessive decoration, confusing interfaces and so on.

If you’re writing and giving presentations, designing emails or wireframing user interfaces, you will need to bear extraneous load in mind. The key takeaway is to always consider the mode of communication – what is the best visual framework for the information you want to convey, and how can you minimise distraction?

Optimise Germane Load

Germane load is the cognitive load required to process, create and automate mental ‘schemas’. For long-term influence and behaviour change, there is a balancing act: you want to increase germane load, which aids learning, whilst also keeping intrinsic load in check! This is because there is something of an inverse relationship between germane and intrinsic load, and a certain level of difficulty is desirable for long-term learning. But in the context of influencing shorter-term decision making, the key is to utilise what’s there already and to stay consistent.

Making use of common schemas stored in long-term memory makes decision making and recall easier, because people are navigating easily recognisable patterns. ‘View my basket’ and ‘checkout’ buttons are good examples of well-defined, recognisable schemas used on online shopping sites. When introducing people to newer applications, keeping the user interface stable and consistent is a key priority for reducing friction and optimising germane load in the future.

Summarising CLT

The interplay of system 1 and system 2 is constant and complementary. When we don’t manage these cognitive loads appropriately, mental effort can become excessive, and we subconsciously ditch system 2 for heuristics or system 1 type thinking. It’s this juncture – the substitution of reasoning for habit – which is the primary cause of error in system 2 problems. So, whilst cognitive load is more aptly described as a system 2 issue, there’s definitely an element of system 1 at work here.

Confirmation Bias

Understanding the relationship between these two modes of thinking is a crucial part of understanding why we approach problems the way that we do. The crux is how our experience and evaluations are constructed. Without getting too deep into this, consider the way that we represent problems: our system 2 thinking relies on the outputs of system 1, which is constantly running in the background.

This means that our evaluations are innately and implicitly biased, since our representation of the world has been constructed via probabilistic and anecdotal models rather than by an objective system. Even when we really think about an issue or problem and are certain we have given it a rational and objective treatment, it’s possible to slip into bias when deciding. This is because of our individual differences in motivation and prior beliefs.

Flat earthers are a great example: some people believe that the earth is flat, and no evidence can convince them otherwise, because their interpretation of the ‘evidence’ confirms their bias beyond doubt. Anti-vaxxers are in this boat too: there are people who genuinely believe that coronavirus vaccines contain mind-controlling chips, their theories confirmed by esoteric ‘evidence’!

These kinds of beliefs are the product of a few things, but primarily one of the most insidious errors of all – confirmation bias, where we seek out, and sometimes manufacture, proof that our prior beliefs are in fact facts! This happens because of the existence of motivated beliefs: system 1 more easily generates representations that coincide with our existing points of view, ratifying previously held bias.

Thereafter, when we evaluate new evidence or problems based on these representations, we generate a belief confirming our previously held assumptions. In exercising our system 2 and performing a motivated evaluation on biased data, we fall victim to confirmation bias.

Mental Accounting

Confirmation bias demonstrates how human beings assign irrational value to data and evidence, which is the main downfall of system 2 – its vulnerability to our value system. Further evidence of this comes from ‘mental accounting’: in the same way that humans moralise data in line with ideologies, it’s also routine for people to assign subjective values to money. The study of subjective value assignment to money was the push that catapulted behavioural economics out of the classical theories of the 70s. Kahneman’s prospect theory [1] showed that ‘homo economicus’ would never make the decisions that real people make about money!

Transaction Utility and Sensitivity to Changes > Levels

As a species, we do a very bad job of accurately assessing value. System 2 is littered with these sorts of biases, which fall into the category of ‘mental accounting’ and concern our handling of money, probabilities and risk. There are many of these, but for brevity let’s look at just a few examples. The most salient is our sensitivity to changes over levels.

Winning a million pounds would feel pretty amazing. But losing £700k the week after would feel disproportionately terrible, despite leaving us £300k above where we started. That seems reasonable enough to me, and that’s the problem – in a world where human beings saw only the objective utility of transactions, we would care less about changes and more about overall levels.

We’d also feel changes in levels as equally pleasurable and painful: winning a million pounds would feel as good as losing it would feel bad. But on the whole, we experience losses as significantly more painful than the utility [2] of equivalent gains. System 2 evaluates these changes along subjective lines.
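
To make the numbers concrete, here’s a minimal sketch of the value function from prospect theory [1]. The parameters (α = β = 0.88, λ = 2.25) are the commonly cited estimates from Tversky and Kahneman’s later work – treat the exact figures as illustrative rather than definitive.

```python
# A minimal sketch of the prospect theory value function [1]. Parameter
# values are commonly cited estimates (alpha = beta = 0.88, lambda = 2.25)
# and are illustrative only.

ALPHA = 0.88          # diminishing sensitivity to gains
BETA = 0.88           # diminishing sensitivity to losses
LOSS_AVERSION = 2.25  # losses loom roughly 2.25x larger than gains

def subjective_value(change: float) -> float:
    """Felt value of a gain or loss, measured from a reference point."""
    if change >= 0:
        return change ** ALPHA
    return -LOSS_AVERSION * (-change) ** BETA

win = subjective_value(1_000_000)   # the million-pound win
loss = subjective_value(-700_000)   # the 700k loss the week after

print(f"felt value of the win:  {win:,.0f}")
print(f"felt value of the loss: {loss:,.0f}")
print(f"net feeling:            {win + loss:,.0f}")
# The net feeling comes out negative, even though we're objectively 300k up.
```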

Linked to our sensitivity to changes over levels is a heuristic Thaler calls transaction utility – “a consumer's behaviour depends not just on the value of goods and services available relative to their respective prices, but also on the consumer's perception of the quality of the financial terms of the deal.” In other words, the context of the transaction can affect our perceptions – whether we feel the deal is fairer given a discount, or more exclusive, can affect our readiness to accept a given transaction. However, these criteria should have no objective bearing on our willingness to make an exchange of goods or services!
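
As a rough sketch of how that quote cashes out: the split into ‘acquisition’ and ‘transaction’ components follows Thaler’s framing [2], but the linear value functions and the jacket numbers below are my own simplifications, not his exact formulation.

```python
# An illustrative reading of Thaler's acquisition/transaction utility
# split [2]. The linear value functions and the jacket scenario are
# simplifying assumptions for the example, not Thaler's formulation.

def acquisition_utility(worth_to_you: float, price_paid: float) -> float:
    # The objective side of the deal: value received minus price paid.
    return worth_to_you - price_paid

def transaction_utility(reference_price: float, price_paid: float) -> float:
    # The perceived quality of the deal, judged against a reference price.
    return reference_price - price_paid

# A jacket you value at exactly 80 pounds, sold at 80 pounds:
full_price = acquisition_utility(80, 80) + transaction_utility(80, 80)

# The same jacket 'reduced' from 100 to 80: nothing objective has changed,
# but the discount manufactures positive transaction utility.
on_sale = acquisition_utility(80, 80) + transaction_utility(100, 80)

print(full_price, on_sale)  # 0 vs 20 -- the 'deal' alone moves the decision
```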

The Contrast Principle

Perhaps my favourite example of poor mental accounting comes from Dr. Robert Cialdini in his excellent book ‘Influence: The Psychology of Persuasion’ [3]. This is the contrast principle:

“There is a principle in human perception, the contrast principle, that affects the way we see the difference between two things that are presented one after another. Simply put, if the second item is fairly different from the first, we will tend to see it as more different than it actually is. So if we lift a light object first and then lift a heavy object, we will estimate the second object to be heavier than if we had lifted it without first trying the light one.”

In monetary terms, the contrast principle works similarly. We view a £100 purchase as significantly less expensive if made after a £1000 purchase. This error is exploited prolifically, perhaps most overtly in new car purchases, where additional high-margin products are sold at rates most would not pay if they were upgrading an existing vehicle via separate transactions.
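
Here’s a toy model of that judgement: a spend assessed relative to the last price seen, rather than in isolation. The ratio-based comparison is a hypothetical simplification for illustration – Cialdini [3] gives no formula – and the anchor values are invented.

```python
# A toy model of the contrast principle: judging a new price against the
# most recent reference point rather than on its own merits. The ratio is
# a hypothetical simplification, not a formula from Cialdini [3].

def perceived_size(price: float, anchor: float) -> float:
    # How large a spend 'feels' next to the most recent reference point.
    return price / anchor

EVERYDAY_ANCHOR = 50   # a typical day-to-day purchase
CAR_ANCHOR = 20_000    # the car you've just agreed to buy

ADDON = 1_000          # e.g. an extended warranty or upgraded trim

print(perceived_size(ADDON, EVERYDAY_ANCHOR))  # 20.0 -- feels enormous
print(perceived_size(ADDON, CAR_ANCHOR))       # 0.05 -- feels negligible
```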

Next Time

In these last examples, it should be becoming clear how systems 1 and 2 can work together, and apart, to cause error. At this point, we’ve discussed many different kinds of error, and you should have a taste for how pervasive these biases can be in everyday life. In the next post, we’ll discuss some specific examples of influencing, with a brief dip into two research papers, each offering insights you can use to nudge your customers.

With your developed understanding of how and why people make errors, you’re now in an excellent position to start digging into these different concepts and nudges! Those papers each teach something different about the way people make decisions, with important takeaways for when you’re designing nudges or working within any given choice architecture.

[1] Kahneman, D., & Tversky, A. (1979). Prospect Theory: An Analysis of Decision under Risk. Econometrica, 47(2), 263-291. doi:10.2307/1914185

[2] Thaler, R. (1983). Transaction Utility Theory. In R. P. Bagozzi & A. M. Tybout (Eds.), Advances in Consumer Research, Volume 10 (pp. 229-232). Ann Arbor, MI: Association for Consumer Research.

[3] Cialdini, R. B. (2007). Influence: The Psychology of Persuasion (Rev. ed., 1st Collins Business Essentials ed.). New York: Collins.

Albert Evans

Implementation Analyst
ContactEngine