Open Thinkering


Exploring the sweet spot for Zappa project approaches to misinformation

Venn diagram showing three overlapping circles (Technical, Procedural, and Relational)

At the overlap of all three circles are the words 'Zappa Project'

At the overlap of Technical and Relational is the word 'Consistency'

At the overlap of Procedural and Relational is the word 'Reliability'

At the overlap of Technical and Procedural is the word 'Efficiency'

The Bonfire logo is at the bottom of the graphic, with the version number 0.1

While we know that misinformation is not a problem that can ever be fully ‘solved’, it is important to try and reduce its harms. Last week, we published the first version of a report based on user research as part of the Zappa project, an initiative from the Bonfire team.

This week, I’ve narrated a Loom video giving an overview of the mindmap embedded in the report. This was requested by Ivan, as he found that the way that I explain it captures some nuance that perhaps isn’t in the report (which is more focused on recommendations).

Another thing I was tasked with this week was creating a Venn diagram of the three types of approaches that could be taken for the technical development of the Zappa project. These were slightly tweaked from suggestions made by one of our user research participants. As you can see in the above diagram, these are:

  • Technical — improving the way that users interact with information
  • Procedural — improving the process of capturing and displaying information
  • Relational — improving the way that humans interact with one another

It’s unlikely that any one approach would sit squarely as being only one type of approach. For example, spending time thinking about the way that information is presented to users and allowing them to control that sits right in the middle of all three.

There are three overlaps other than the one right in the middle. These are:

  • Technical / Procedural — we’ve currently labelled this intersection as Efficiency as using technical approaches to improve processes usually makes them more efficient. This might include making it easier to block certain types of posts, for example.
  • Procedural / Relational — we’ve labelled this intersection as Reliability because process, when considered in terms of relationships, is often focused on repeatable patterns. This might include, for example, being able to validate that the account you’re interacting with hasn’t been hijacked.
  • Relational / Technical — we’ve used the label Consistency for this one as one of the things we found from our research is that users are often overwhelmed by information. We can do something about that, so this might include helping users feel in charge of their feeds to help avoid context collapse or aesthetic flattening.

You will notice the version number appended to this diagram is ‘v0.1’. It might be that we haven’t found the right words for these overlaps. It might be that some are more important than others. We’d love feedback from anyone paying attention to the Zappa project, whether you’ve been following the work around Bonfire since the beginning, or whether this is the first you’re hearing of it.

If it helps, feel free to grab the Google Slides original of the above Venn diagram, or comment below on a) what you think is good, b) anything you have questions about, or c) anything that concerns you.

First version of report published sharing findings from Zappa project user research

Illustration of a bear, fox, badger, and squirrel looking into a fire. Taken from the Bonfire website.

Over the last month, I’ve been working with the Bonfire team to perform some initial user research on the Zappa project. The aim of the project, funded by a grant from the Culture of Solidarity Fund, is to empower communities with a dedicated tool to deal with the coronavirus “infodemic” and online misinformation in general.

We ended up speaking with 11 individuals and organisations, and have synthesised our initial user research into the first version of a report which is now available.

📄 Click here to access the report

Countering misinformation in federated social networks: an introduction to the Zappa project

Illustration of birds from Bonfire website

One thing I’ve learned from spending all of my adult life online and being involved in lots of innovation projects is that you can have the best bookmarking system in the world, but it means nothing if you don’t do something with the stuff you’ve bookmarked. Usually, for me, that means turning what I’ve filed away into some kind of blog post. It’s basically the reason Thought Shrapnel exists.


Last week I started some new work with the Bonfire team called the Zappa project. Bonfire is a fork of CommonsPub, the underlying codebase for MoodleNet.

Self-host your online community and shape your experience at the most granular level: add and remove features, change behaviours and appearance, tune, swap or turn off algorithms. You are in total control.

Bonfire is modular, with different extensions allowing communities to customise their own social network. The focus of Zappa is shaped by a grant from the Culture of Solidarity Fund.

The grant will be used to release a beta version of Bonfire Social and to develop Zappa – a custom bonfire extension to empower communities with a dedicated tool to deal with the coronavirus “infodemic” and online misinformation in general.

The announcement blog post talks of “experimental artificial intelligence engines” and “Zappa scores” which may be longer-term goals, while my job is to talk to people with real-world needs right now. As I’ve learned from being involved in quite a few innovation projects over the last 20 years, there’s a sweet spot between what’s useful, theoretically sound, and technically achievable.


Last week, I met with Ivan to try to define the user groups and the initial scope of the project. It’s easy to think that the possible target audience is ‘everyone’, but it’s of much more value to think about who the Zappa project is likely to be useful for in the near future.

Priority areas for stakeholders, user groups, and themes

The above Whimsical board shows:

  • a list of people we can/should speak to (we’ve spoken with two orgs so far)
  • themes of which we should be aware
  • groups of people we should talk with

The latter two lists are prioritised based on our current thinking and, as you can see, it’s biased towards action, towards those who don’t have merely an academic interest in the Zappa project, but who have some skin in the anti-misinformation game.


A note in passing: many people use ‘misinformation’ and ‘disinformation’ as near-synonyms of one another. But, even in common usage, it’s clear that they have an important difference in meaning.

We’d say, for example, that someone was ‘misinformed’, in which case their lack of having the correct information wouldn’t necessarily be their fault. On the other hand, we might talk about state actors waging a ‘disinformation’ campaign, which very much would be intentional, and probably focused on creating a mixture of fear, uncertainty, and/or doubt.

The line between misinformation and disinformation can be blurry, but it’s probably helpful to conceptualise what we’re doing in the terms of the grant: to help “empower communities with a dedicated tool to deal with the coronavirus ‘infodemic’ and online misinformation in general”.


One of the resources that I’ve found particularly helpful (and which I wish I’d seen before presenting on Truth, Lies & Digital Fluency a couple of years ago) is Fake news. It’s complicated. Its author, Claire Wardle from First Draft, lays out 7 Types of Mis- and Disinformation on a spectrum from ‘satire or parody’ (which some wouldn’t even conceptualise as misinformation) to ‘fabricated content’ (which most people would definitely consider disinformation).

7 Types of Mis- and Disinformation

Some of the differences between these types can be quite nuanced, and so I found the Misinformation Matrix in the post really useful for looking at the reasons for the misinformation being published in the first place. These range from sloppy journalistic practices, through to flat-out propaganda.

Misinformation Matrix

The user research we’re doing at the moment focuses on which types of misinformation human rights organisations, scientists, and other front-line orgs are suffering from, how and where these manifest, and what they’ve tried to do about it.

So far, we’ve discovered that countering misinformation can be a huge time suck for people who are often volunteering for charities, non-profits, or loosely-organised groups. Some areas of the world seem to suffer more than others, and particular platforms are currently doing worse than others. All of them could, of course, do much better.


We’re still gathering people and organisations for this project. So if, based on the above, you know someone you think we should talk to, then please get in touch! You can leave a comment below, or get in contact via email.
