Open Thinkering

TB871: Managing variety using amplifiers and attenuators

Note: this is a post reflecting on one of the modules of my MSc in Systems Thinking in Practice. You can see all of the related posts in this category


Balancing autonomy and control is crucial for maintaining a viable system. This balance is especially important in a software development team, where management seeks to guide the complex processes of development, testing, and deployment effectively. The diagram below illustrates this within a software development environment, with the rest of this post exploring how ‘amplifiers’ and ‘attenuators’ help manage the variety in such systems, ensuring effectiveness and coherence.

Diagram titled "Managing variety within software development teams" showing interconnected sections: "Environment", "Operations", and "Management", linked by grey arrows forming a loop.
Diagram based on an original, which used a different example, in the module materials (The Open University, 2020)

Autonomy and Control

Autonomy refers to the degree of freedom and self-direction possessed by parts of a system. Control is the ability of the overarching system to direct these autonomous parts towards a common goal.

According to Ashby’s Law of Requisite Variety, for a system to be viable, the variety (i.e. complexity) in the controller (management) must at least match the variety in the system being controlled (operations). Achieving this balance is often challenging due to the inherent complexity and unpredictability of the wider environment of software development.
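The counting intuition behind Ashby’s law can be sketched in a few lines of Python. This is a toy model, not anything from the module materials: the numbers and the `residual_variety` function are hypothetical, just to show that outcomes can be driven no lower than disturbance variety divided by controller variety.

```python
def residual_variety(disturbance_states: int, controller_responses: int) -> int:
    """Minimum number of outcome states the controller cannot tell apart.

    In its simplest counting form, Ashby's law says a regulator with V_R
    distinct responses facing V_D distinct disturbances leaves at least
    ceil(V_D / V_R) outcome states uncontrolled.
    """
    # Ceiling division without importing math: -(-a // b) == ceil(a / b)
    return -(-disturbance_states // controller_responses)


# A team facing 12 kinds of incident with only 3 standard responses is
# left with at least 4 outcome states it cannot regulate apart...
print(residual_variety(12, 3))   # -> 4
# ...whereas matching variety (12 responses) makes full regulation possible.
print(residual_variety(12, 12))  # -> 1
```

Amplifiers, in these terms, raise `controller_responses`; attenuators lower `disturbance_states`. Either move pushes the residual variety towards 1.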

Amplifiers and Attenuators

To manage this complexity, systems use amplifiers and attenuators. These help align the variety between management and operations in an attempt to enable effective control without stifling autonomy.

Amplifiers increase the system’s capacity to handle variety by making use of additional resources and/or decentralising decision-making. In the software development example, this could include:

  • Automated testing: using automated testing tools increases the variety of tests that can be conducted without overwhelming the developers.
  • Flexible work schedules: implementing flexible working hours allows the team to adapt to varying workloads more effectively.
  • Delegated authority: empowering team leaders or senior developers to make decisions on the spot can address issues promptly, enhancing responsiveness and reducing the burden on upper management.
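The automated-testing point can be made concrete with a minimal, hypothetical sketch: a table-driven test in which one checking loop covers many input variations, so adding a row amplifies test coverage without adding developer or management overhead. The `slugify` function here is a made-up stand-in for any unit under test.

```python
import re


def slugify(title: str) -> str:
    """Lower-case a title, replacing runs of non-alphanumerics with '-'."""
    return re.sub(r"[^a-z0-9]+", "-", title.lower()).strip("-")


# The case table is the amplifier: each new row is a whole new test,
# exercised automatically, with no new review or coordination cost.
CASES = [
    ("Hello World", "hello-world"),
    ("  Spaces  everywhere ", "spaces-everywhere"),
    ("MiXeD CaSe!!", "mixed-case"),
]

for title, expected in CASES:
    assert slugify(title) == expected, (title, slugify(title))
print(f"{len(CASES)} cases passed")
```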

Attenuators reduce the variety that the management system needs to handle. Again, in our software development example, this might look like:

  • Segmented development phases: instead of tackling the entire development process at once, projects can be broken down into phases like development, testing, and deployment, reducing the complexity to be handled at any one stage.
  • Standardised development frameworks: implementing frameworks like Agile or Scrum can help ensure consistency and predictability, which in turn can simplify management oversight.
  • Departmental organisation: dividing the team into specialised units (e.g., front-end, back-end, quality assurance) helps manage complexity by narrowing the focus of each group.
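The departmental-organisation point can also be sketched in code, under invented assumptions: incoming issues (high variety) are routed to a small, fixed set of units, so management only ever sees a handful of queues rather than every individual problem. The unit names and keyword lists below are hypothetical placeholders.

```python
import re

# A fixed, small set of units: this mapping is the attenuator.
ROUTES = {
    "front-end": {"ui", "css", "layout", "browser"},
    "back-end": {"api", "database", "server"},
    "qa": {"test", "regression", "flaky"},
}


def route(issue: str) -> str:
    """Collapse an arbitrary issue description onto one of a few units."""
    # Match whole words, so e.g. 'suite' does not trigger the 'ui' keyword.
    words = set(re.findall(r"[a-z]+", issue.lower()))
    for unit, keywords in ROUTES.items():
        if words & keywords:
            return unit
    return "triage"  # residual variety still needs a home


print(route("Button layout broken in Safari"))   # -> front-end
print(route("API returns 500 on login"))         # -> back-end
print(route("Nightly regression suite is red"))  # -> qa
```

Whatever the phrasing of the incoming issue, management now deals with at most four categories: the variety reaching it has been attenuated.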

A balancing act

The key to a viable system lies in the delicate balance between autonomy and control. Too much control can stifle creativity and responsiveness, while too much autonomy can lead to chaos and misalignment with organisational goals.

Amplifiers and attenuators are helpful tools in achieving this balance, as they ensure that management can effectively guide operations without being overwhelmed by complexity, and that operational units have the freedom to adapt and respond to immediate challenges.

References

TB871: Systems in my situation of interest

Note: this is a post reflecting on one of the modules of my MSc in Systems Thinking in Practice. You can see all of the related posts in this category


For this module, my area of practice is community development and wellbeing, with my situation of interest being library services. Zooming into that situation, we find a number of systems in play. In the diagram below, I have highlighted (with small, solid yellow circles) the ones I deem most important:

Activity 3.14 in this module informs me that, later, I will choose one of these to work with for the remainder of this section. If I’ve understood properly what’s required of me, that will probably be ‘Lifelong learning in a library context’.

Although I need to learn more about the Viable System Model (VSM), I think applying it to this area will be interesting, as I’m pretty sure the library service (at least where I live) is very much out of step with its environment.

Painting over problems with AI in the third sector

A breeze block wall being painted over. An image of a door is being painted onto the wall as well.

Attending professional events can often reveal the wildly different mental models that underpin the work that we do. This was particularly evident to me at an event I attended yesterday, where the speakers’ worldviews seemed to differ significantly from my own. It’s a reminder that even within sectors where we assume shared values, such as the third sector, we can understand and interpret the world in vastly different ways.

For instance, the speakers at this event clearly demonstrated that you can work in the third sector and still uphold capitalist values, for example by focusing on ‘closing the gap’ without questioning why the gap exists in the first place. To my mind, it’s not enough to merely address disparities; we should be challenging the structures that create them.

Advocacy and activism should be integral to third sector work, pushing for systemic change rather than just mitigating symptoms. Yet much of what I heard was a ‘hope’ that people won’t be left behind in an inevitable AI-driven future, without a critical examination of how this future is shaped and who it benefits.

I also encountered some confusing references in passing to ‘AI literacy’. This term was used in a way that often lacked clarity and coherence. In my thesis, I argued that new literacies are not thresholds to reach but conditions to attain. AI literacy should be treated no differently from other digital literacies, requiring deliberate practice and an understanding of underlying mechanisms. It’s about encouraging and developing ‘habits of mind’ that allow individuals to navigate and critically engage with AI technologies.

We’ve been exploring definitions at ailiteracy.fyi, and I’m convinced that, as with other forms of literacy, definitions are a power move, with individuals and organisations seeking to dictate what does or does not constitute ‘literate practice’. AI literacy is one of many digital literacies involving not only technical skills but also an understanding of the ethical, societal, and economic implications of AI. Feel free to read about the eight elements which underpin this here.

Going back to third sector organisations and AI, the rush to adopt this particular technology seems to be mainly focused on increasing service efficiency in the face of limited budgets and challenges around funding. That lack of funding is itself a symptom of our capitalist system, with its ‘winners’ and ‘losers’, which inevitably leaves whole sections of the population behind.

Organisations find themselves in a position where they must continuously do ‘more with less’, driving them to embrace technologies that promise efficiency without questioning the broader implications. This often leads to a superficial adoption of AI, focusing on immediate gains rather than long-term, sustainable, and equitable solutions.

We need to think differently. If we can’t adopt a holistic and inclusive perspective towards humanity, how can we expect to do so for our interdependent ecosystem? While AI has the potential to aid in climate mitigation and health improvements, we have to collectively adopt a new mental model to use it effectively. Otherwise, it’s going to be an accelerant for a somewhat-dystopian future; not because the technology itself is problematic, but because of the structures within which it is used.

This means rethinking our values and approaches, moving away from a mindset of competition and scarcity towards one of collaboration and abundance. It may sound utopian, but only then can we harness technology’s potential to create a more just and equitable world.


Image CC BY-ND Visual Thinkery. Bryan originally created this to illustrate the concept of ‘openwashing’ but I think it also works in relation to what I’m talking about here: people pretending that there’s anything other than a big wall between the haves and the have-nots in society.
