
TB871: Ambiguity and cognitive biases

Note: this is a post reflecting on one of the modules of my MSc in Systems Thinking in Practice. You can see all of the related posts in this category


You may have watched this video, which was referenced by the brilliant Cathy Davidson in her book Now You See It. But perhaps you haven’t seen, as I hadn’t, a related one called The Door Study. TL;DR: it was one of the first confirmations of ‘change blindness’ outside of a laboratory setting. What we perceive is usually what we’ve been primed to pay attention to, rather than a straightforward sensing of data from our environment.

It’s a good reminder that, phenomenologically speaking, much of what we experience of the external world, such as colour, doesn’t actually ‘exist’ in an objectively meaningful way. We construct our observation of the environment; everything is a projection. As the module materials quote Heinz von Foerster as saying: “Objectivity is the delusion that observations could be made without an observer” (The Open University, 2020a).

When it comes to systems thinking, this is a key reason why the idea of ‘perspective’ is so important: the world is different depending on our point of view, and we can struggle to see that it is even possible to observe it differently. A good example of this, other than the familiar rabbit-duck illusion, is the Necker cube (The Open University, 2020b):

A series of lines that look like a 3D cube

This is literally a flat pattern of 12 lines, so the fact that most of us see it automatically as a 3D object is due to our brains modelling it as such. However, our brains aren’t sure whether we should see the underside of the bottom face of the cube, or the upper side of the top face. As with the rabbit-duck illusion, our brains can see one of these, but not both at the same time.
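To make the ‘flat pattern of 12 lines’ concrete, here’s a minimal sketch (my own illustration, not from the module materials): it derives the twelve edges of a unit cube and flattens them with a simple oblique projection, yielding exactly the twelve 2D line segments that our brains insist on reading as a solid object.

```python
# A flat pattern of twelve lines: derive the cube's edges, then project them.
from itertools import combinations

# The eight corners of a unit cube as (x, y, z) tuples
corners = [(x, y, z) for x in (0, 1) for y in (0, 1) for z in (0, 1)]

# An edge joins two corners that differ in exactly one coordinate
edges = [
    (a, b)
    for a, b in combinations(corners, 2)
    if sum(i != j for i, j in zip(a, b)) == 1
]

def project(point, depth=0.4):
    """Oblique projection: fold the z axis onto the x/y plane."""
    x, y, z = point
    return (round(x + depth * z, 2), round(y + depth * z, 2))

segments = [(project(a), project(b)) for a, b in edges]

print(len(segments))  # 12: the whole 'cube' is just twelve 2D line segments
```

Everything else, the solidity, the depth, and the flipping between the two orientations, is supplied by the viewer.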

Going further, consider the ink blot below, taken from the module materials (The Open University, 2020c):

Using this kind of image diagnostically is usually referred to as a Rorschach test, which many people consider pseudo-scientific. However, it is interesting as a parlour trick to show how people think about the world. For example, I see two people with their hands behind their backs, kissing. I’m not sure what that says about me, if anything!

Spending a bit more time with the ink blot, I began to see it as a kind of eye covering that one might wear at a masked ball. Switching between these two is relatively easy, as the primary focus for each is on a different part of the image.

There are similar approaches to the above, for example the Thematic Apperception Test (TAT) created by Henry A. Murray and Christiana D. Morgan at Harvard University, which presents subjects with an ambiguous image (e.g. a photograph) and asks them to explain what is going on.

The rationale behind the technique is that people tend to interpret ambiguous situations in accordance with their own past experiences and current motivations, which may be conscious or unconscious. Murray reasoned that by asking people to tell a story about a picture, their defenses to the examiner would be lowered as they would not realize the sensitive personal information they were divulging by creating the story.

(Wikipedia)

These approaches show how much we construct our understanding of the world rather than simply experiencing it as something objectively ‘out there’. There’s a wonderful image created by John Manoogian III (JM3), based on a synthesis by Buster Benson, which I had on the wall of my old home office (and will have in my new one when it’s constructed!). It groups the various cognitive biases to which we humans are susceptible:

Diagram titled "Cognitive Bias Codex, 2016" showing different cognitive biases organized around a central brain image into four categories: "Too Much Information," "Need To Act Fast," "Not Enough Meaning," and "What Should We Remember?".

As you can see, these are boiled down to four underlying problems (sketched as a simple data structure after the list below):

  • What should we remember?
  • Too much information
  • Not enough meaning
  • Need to act fast
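As a rough sketch in code (my own illustration, not Benson’s or JM3’s actual data), the codex is essentially a mapping from these four underlying problems to the biases that respond to them. The placements below are approximate, populated with a few of the biases from the list that follows:

```python
# An illustrative (not authoritative) rendering of the codex's structure:
# four underlying problems, each mapped to biases that respond to them.
# The full codex organises a couple of hundred biases this way.
codex: dict[str, list[str]] = {
    "What should we remember?": ["priming bias"],
    "Too much information": ["anchoring effect", "confirmation bias"],
    "Not enough meaning": ["projection bias", "belief bias"],
    "Need to act fast": ["self-serving bias", "fundamental attribution error"],
}

for problem, biases in codex.items():
    print(f"{problem}: {', '.join(biases)}")
```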

Here are ten of the most common biases we are prone to:

  1. Confirmation bias: Favouring information that confirms pre-existing beliefs while discounting contrary information, often seeking validation rather than refutation.
  2. Fundamental attribution error: Overemphasising personality-based explanations for others’ behaviours and underestimating situational influences, particularly noted in Western cultures.
  3. Bias blind spot: Believing oneself to be less biased than others, exemplifying a self-serving bias.
  4. Self-serving bias: Attributing successes to oneself and failures to external factors, motivated by the desire to maintain a positive self-image.
  5. Anchoring effect: Relying too heavily on the first piece of information encountered (the anchor) when making decisions, influencing both automatic and deliberate thinking.
  6. Representative heuristic: Estimating event likelihood based on how closely it matches an existing mental prototype, often leading to misjudgement of risks.
  7. Projection bias: Assuming others think and feel the same way as oneself, failing to recognise differing perspectives.
  8. Priming bias: Being influenced by recent information or experiences, leading to preconceived ideas and expectations.
  9. Affinity bias: Showing preference for people who are similar to oneself, often unconsciously and based on subtle cues.
  10. Belief bias: Letting personal beliefs influence the assessment of logical arguments, leading to biased evaluations based on the perceived truth of the conclusion.

Of course, just having these on one’s wall, or being able to name them, doesn’t make us any less likely to fall prey to them!


TB872: Four pervasive institutional settings inimical to the flourishing of systems practice

Note: this is a post reflecting on one of the modules of my MSc in Systems Thinking in Practice. You can see all of the related posts in this category


On one side, there is an explosion of hyper-vivid, surreal organic forms in a kaleidoscope of ultra-bright, neon colors, representing the full force of human emotions in their most extreme expression. The forms are so intense and lively that they seem to leap out of the image. The opposite side presents the zenith of sterile, mechanical coldness: a stark, lifeless landscape of rigid, ultra-precise geometric shapes and complex machinery in grayscale, symbolizing an absolute void of emotion and a kind of dystopia. The dramatic disparity between the two sides creates a powerful visual shock, emphasizing the extreme dichotomy between unbridled emotional expression and absolute emotional suppression.

This post builds upon a previous one about ‘projectification’ and ‘apartheid of the emotions’ and deals with Chapter 9 of Ray Ison’s Systems Practice: How to Act, in which he outlines four settings that constrain systems practice.

They are:

  1. A pervasive ‘target mentality’
  2. Living in a ‘projectified world’
  3. Failures around ‘situation framing’
  4. An ‘apartheid of the emotions’

Because of the way this module is structured, I had not yet studied the juggler isophor when I wrote the previous post. Reading this chapter again with that new frame of reference is enlightening:

In my experience systems practice which only focuses on methods, tools and techniques is ultimately limited in effectiveness. This is particularly so at this historical moment because the organizational and political situation has generally not been conducive to enacting systems practice… [T]o be truly effective in one’s systems practice it may mean that changes have to be made in both practice and situations so that practice is re-contextualised.

Ison (2017, p.224)

A couple of days ago, I wrote about exactly this: that, from what I can see, governmental approaches to ‘systems thinking’ are very much about “methods, tools and techniques” in a world of targets and projects. Instead of understanding context and emotions, systems are framed as being ‘out there’ in the world (rather than human constructs).

While discussing the characterisation of natural resource issues as ‘resource dilemmas’, Ison (ibid., pp. 238-9) outlines a ‘framing shift’ which incorporates five elements:

  1. Interdependencies
  2. Complexity
  3. Uncertainty
  4. Controversy
  5. Multiple stakeholders and/or perspectives

What I like about this in relation to my own work is that these are often exactly the kind of things that hierarchical organisations (and most clients) want to minimise or avoid talking about. And I would suggest that it is this reticence that leads to an over-use of targets, rampant projectification, failures around situation framing, and an apartheid of the emotions.

What is possibly missing from all of this is the psychological element of working with others. This is related to, but separate from, emotions, and is perhaps most easily understood through the grouping that Buster Benson has made of over 200 cognitive biases to which we as humans are subject:

  1. “There’s too much information to process, and we have limited attention to give, so we filter lots of things out.”
  2. “Lack of meaning is confusing, and we have limited capacity to understand how things fit together, so we create stories to make sense of everything.”
  3. “We never have enough time, resources, or attention at our disposal to get everything that needs doing done, so we jump to conclusions with what we have and move ahead.” (Benson, n.d.)

I’ve had the following image on the wall of my office for the last five years:

Buster Benson's Cognitive Bias Codex

Just like the PFMS example, we deal in heuristics because of our human psychology. That means we tend to simplify things based on prior experience, reducing complexity and uncertainty where possible, and doing uncontroversial things so that we don’t have to get input from lots of people (and deal with their needs). It’s entirely understandable. But, as the subtitle and context of Ison’s book suggest, this isn’t going to cut it for dealing with “situations of uncertainty and complexity in a climate-change world”.


In terms of my own experience, I’m not even sure where to start. I began my career in UK schools, that is to say, in institutions that are extremely hierarchical, deal in social reproduction, and are filled with staff members who (mostly) did well at school themselves. In addition, change is exogenous in this sector, coming from politically motivated announcements from ambitious government ministers eager to placate the right-wing tabloid press.

As such, my experience of working in schools was of hard-working and well-meaning staff cosplaying what they thought people do in a business setting. Young people were reduced to numbers on a spreadsheet, and things might have worked very well in the classroom in practice, but they didn’t work well in theory, so they were canned. I loved teaching. I didn’t enjoy everything that was wrapped around it.

If education is a system to inspire the lifelong learning of young people by introducing them to a range of experience, which would be my framing, then the system was failing when I was a teacher, and is failing my own children.

I’m not going to rehearse my career history, but instead I’ll compare and contrast this with my current practice as part of the co-op of which I’m a founding member. In this work, although we have better and worse clients, we get to lean into the ambiguity, the uncertainty, and the complexity that results when humans work with one another.

We endeavour to call the way we work with clients a ‘partnership’ rather than simply working on a ‘project’. I’ve been inspired by people like Kayleigh Walsh, who we interviewed in Season 4 of our podcast, and how they bring their full selves to work. Even with straight-faced, straight-laced people who work for ‘serious’ organisations it’s possible to treat one another as human beings subject to good days, bad days, and all of the emotions that go with the various seasons of our lives.

Some of this has been brought home to me in the last week or so, with the contrast between two organisations. One, partly because of funding constraints, asked us to go through an involved, time-consuming process in order to respond to an Invitation to Tender (ITT). Despite the situation we were potentially going into being essentially unknowable without doing the research, we were being asked for project plans and all kinds of details at which we could only guess.

It reminded me very much of what Ison describes in Chapter 10 of Systems Practice, except we weren’t particularly in a position to suggest another approach; we just wouldn’t have got the work. To be fair to the people involved in the organisation, I think they knew that a different approach was needed, but they were constrained by the logic of the systematic approach imposed upon them. In other words, systematic thinking prevented a systemic approach.

If we compare this with a Theory of Change workshop we ran yesterday for a different organisation, then the difference in approach is clear. An example of the basic template we use for this, based on work by Outlandish, is below:

Theory of Change template with 'Final goal', 'Outcomes' and 'Activities'

During the session, we surfaced differences between what came out of the user research with staff members and what is included in the reports they publish. We used this as an entry point for each member of the small team to fill in boxes underneath the prompts below (one way of capturing the responses is sketched after the list):

  • What we do…
  • …to influence these people…
  • …to have this impact in the world
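To make the exercise concrete, here’s a minimal sketch (entirely my own framing and example text, not Outlandish’s template or any tooling we actually use) of how each team member’s responses to the three prompts might be captured, so that entries can be lined up and compared:

```python
# A hypothetical representation of one workshop exercise: each person
# answers the same three prompts, and the entries are compared side by side.
from dataclasses import dataclass

@dataclass
class TheoryOfChangeEntry:
    what_we_do: str        # "What we do..."
    who_we_influence: str  # "...to influence these people..."
    intended_impact: str   # "...to have this impact in the world"

entries = [
    TheoryOfChangeEntry(
        what_we_do="Run user research with staff",
        who_we_influence="the senior leadership team",
        intended_impact="published reports reflect what staff actually say",
    ),
    TheoryOfChangeEntry(
        what_we_do="Facilitate workshops",
        who_we_influence="the whole small team",
        intended_impact="a shared, explicit overall goal",
    ),
]

# Surfacing differences is then a matter of lining the entries up:
for e in entries:
    print(f"{e.what_we_do} -> {e.who_we_influence} -> {e.intended_impact}")
```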

As expected, this is not an easy thing to do, and each team member surfaced something slightly different. We then went round the circle twice, first asking everyone to give some context to the text they had entered in their three boxes, and then asking for things that someone else mentioned with which they would definitely agree (or disagree). From there, we attempted, through structured conversation, to synthesise an overall goal.

Sometimes, you just need someone to do some work which fits in as a piece of an extremely well-designed jigsaw. But the number of situations in which this is true is much smaller than most people imagine. In my experience, siloed working and cognitive biases mean that few of us can answer more than a couple of ‘why’ levels deep even in relation to work that is important to us.

As I’ve said before, what I really appreciate about this module, hard and time-consuming though I’m finding it at times, is that it’s a justification of an approach to life that I’ve carried with me from the start of my career. It’s refreshing to realise that I’m not alone in thinking that putting on a suit and tie and talking about KPIs and OKRs is not the right way to improve the world.


References

  • Benson, B. (no date). Cognitive biases. Available at: https://busterbenson.com/piles/cognitive-biases/ (Accessed: 31 January 2024).
  • Ison, R. (2017). Systems practice: how to act. London: Springer. Available at: https://doi.org/10.1007/978-1-4471-7351-9.

Image: DALL-E 3
