Open Thinkering


Tag: jobs

Trust no-one: why ‘proof of work’ is killing the planet as well as us

Note: subtlety ahead. This post uses cryptocurrency as a metaphor.

Painting of women working in a field. One has been cut out of the painting and is sitting in the corner of the frame, smoking.

As you may have read in the news recently, the energy requirements of Bitcoin are greater than those of some countries. This is because of the ‘proof of work’ required to run a cryptocurrency without a centralised authority. It’s a ‘trustless’ system.

While other cryptocurrencies and blockchain-based systems use other, less demanding, cryptographic proofs (e.g. proof of stake), Bitcoin’s approach requires increasing amounts of computational power as the cryptographic proofs get harder.

As the cryptographic proofs serve no function other than ensuring the trustless system continues operating, it’s tempting to see ‘proof of work’ as inherently wasteful. Right now, it’s almost impossible to purchase a graphics card, as GPUs are being bought up and deployed en masse to ‘mine’ cryptocurrencies like Bitcoin.
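To make the mechanism concrete, here’s a minimal, hashcash-style sketch in Python (the block data and difficulty are illustrative toys, not Bitcoin’s real parameters): keep hashing until you find a nonce whose hash starts with a required number of zeroes. Finding the nonce takes brute force; checking it takes a single hash.

```python
import hashlib

def proof_of_work(block_data: str, difficulty: int) -> int:
    """Find a nonce whose SHA-256 hash has `difficulty` leading zero hex digits."""
    target = "0" * difficulty
    nonce = 0
    while True:
        digest = hashlib.sha256(f"{block_data}{nonce}".encode()).hexdigest()
        if digest.startswith(target):
            return nonce
        nonce += 1  # no shortcut exists: this brute force IS the 'work'

nonce = proof_of_work("example block", difficulty=4)
# Verification, by contrast, is one hash: cheap for everyone else in the network.
assert hashlib.sha256(f"example block{nonce}".encode()).hexdigest().startswith("0000")
```

Note that each extra zero of difficulty multiplies the expected hashing effort by sixteen, while verification stays constant-time: that asymmetry is the whole point, and the whole waste.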

Building a system to be trustless comes with huge externalities; the true cost comes elsewhere in the overall system.

🏭 🏭 🏭

Let’s imagine for a moment that, instead of machines, we decided to deploy humans to do the cryptographic proofs. We’d probably question the whole endeavour and the waste of human life.

The late David Graeber railed against the pointless work inherent in what he called ‘bullshit jobs’. He listed five different types of such jobs, which he argued comprise more than half of the work carried out by people currently in employment:

  1. Flunkies — make their superiors feel more important (e.g. door attendants, receptionists)
  2. Goons — oppose other goons hired by other people/organisations (e.g. corporate lawyers, lobbyists)
  3. Duct Tapers — temporarily fix problems that could be fixed permanently (e.g. programmers repairing shoddy code, airline desk staff reassuring passengers)
  4. Box Tickers — create the appearance that something useful is being done when it is not (e.g. in-house magazine journalists, corporate compliance officers)
  5. Taskmasters — manage, or create extra work for, those who do not need it (e.g. middle management, leadership professionals)

What cuts across all of these is the ‘proof of work’ required to keep the status quo in operation. This is mostly obvious through ‘Box Tickers’, but it is equally true of middle management ensuring work is seen to be done (and that hierarchical systems prevail).

✅ ✅ ✅

There is much work that is pointless, and it could be argued that an important reason for this is that we have a trustless society. For example, when some of the most marginalised people in our communities ask for help between jobs, we require them to prove that they are spending 35 hours per week looking for one. It’s almost as if someone in government has taken the pithy phrase “looking for a job is a full time job” and run with it.

Western societies have been entirely captured by the classic economic argument that everything will turn out well if we all act in our own self-interest. I’m not sure if you’ve looked around you recently, but it seems to me that this model isn’t exactly… working?

It’s my belief, therefore, that we need to engender greater trust in society. Ideally, this trust should be inter-generational and multicultural, seeking to build bridges between different groups, rather than building solidarity in one group at the expense of others.

This is not a call to naivety: I’m well aware that trust comes in different shapes and sizes. What I think we’re losing, however, is an ability to trust people with small things. As a result, we’re out of practice when it comes to bigger things.

👁️ 👁️ 👁️

The Russian phrase Доверяй, но проверяй means, I believe, “trust, but verify”. It’s a useful approach to life, and an approach I use with everyone from members of my family to colleagues on various projects I’m working on.

The important thing here is the ‘trust’ part, with the occasional ‘verify’ to ensure that people don’t, well, take the piss. What we’re seeing instead is ‘verify and verify’: an increasing verificationism in which we spend our lives proving who we are as well as our eligibility. This disproportionately affects already-marginalised people. It is a burden and a tax on living a flourishing human existence.

🧑‍🤝‍🧑 🧑‍🤝‍🧑 🧑‍🤝‍🧑

Back in 2013, I wrote a series of blog posts reflecting on a talk by Laura Thomson entitled Minimum Viable Bureaucracy. In one of these entitled Scale, Chaordic Systems, and Trust I wrote:

You can build trust “by making many small deposits in the trust bank” which is a horse-training analogy. It’s important to have lots of good interactions with people so that one day when things are going really badly you can draw on that. People who have had lots of positive interactions are likely to work more effectively to solve big problems rather than all pointing fingers.

To finish, then, I want to reiterate two things that Laura Thomson recommended that anyone can do to build trust:

  1. Begin by trusting others
  2. Be trustworthy

Solidarity begins at home.


This post is Day 87 of my #100DaysToOffload challenge. Want to get involved? Find out more at 100daystooffload.com. Image by Banksy.

Why do we hire based on ‘experience’? HR, Automattic, and Open Badges

It’s 2016. Nobody can reasonably expect to have a ‘job for life’, or even work within the same organisation for more than a few years. As a result, you’re likely to dip into the jobs marketplace more often than your parents and grandparents did. That means it’s increasingly important to be able to prove:

  • who you are
  • what you know
  • who you know
  • what you can do

Unfortunately, hiring is still largely based on submitting a statement of skills and experience we call a ‘Curriculum Vitae’ (or résumé) along with a covering letter. This may lead to an interview and, if you like each other, the job is yours. We have safeguards in place at every step to ensure people don’t discriminate on the basis of age, gender, or postcode. Despite this, almost every part of the current process is woefully out-of-date. I’ve plenty to say about all of this, but will save most of it for another time.

In this post I’m particularly interested in why we include ‘job history’ or ‘experience’ when applying for new positions. Given that we have so little time and space to highlight everything we stand for, why do we bother including it? Academic credentials are bona fides, but job history is a bit more nebulous. Why is it still such a prominent feature of our LinkedIn profiles? Why do we email people CVs listing our ‘experience’?

Whether you think that looking at someone’s job history allows for a good ‘cultural fit’, or allows you to make assumptions about the network they bring with them, the reality is that we use job histories as a filter. They’re a useful shorthand. After all, if someone has been hired by Google or another big-name organisation, that’s a bit like saying they went to an elite university. We tend to believe in the judgments made by these kinds of organisations and institutions. We trust the filters. If the person was good enough for those organisations, we think, then they must be good enough for ours.

We like to tell ourselves that we live in a meritocratic world. If someone is good enough, so the story goes, then they can achieve the qualifications and experience necessary to get the job they want. Unfortunately, because of a combination of unconscious bias, innovation immune systems, and the new nepotism, some groups of people are effectively excluded from consideration. Don’t know the right people? Not good at interviews? Have skills too advanced or too new for qualifications to have been developed yet? Bad luck, buddy.

Another problem is that we tend to use what I call ‘chunky black box qualifications’ as proxies of the thing we’re trying to hire for. As an example, take jobs that require a degree ‘in any discipline’. What does that actually mean in practice? They want somebody who can think at a certain level, someone who is likely to come across as ‘professional’, someone who can submit work on time. However, we’re not directly looking at the assessment of the particular quality in this situation, we’re merely using an imperfect proxy.

There are many ways round the current status quo. For example, Automattic (the company behind WordPress, which powers a lot of websites) does hiring very differently to the standard model. As outlined in this post, when hiring developers they test candidates in real-world situations through paid trials. In fact, as Automattic is a globally-distributed company, communication happens mainly through text. Most candidates don’t have a voice conversation with anyone at the organisation until they’re hired! Obviously this wouldn’t necessarily work in every sector, but it is a good example of thinking differently: focus on what the candidate can do, not what they claim to be able to do.

Another way to approach things differently in hiring is to seek wherever possible to break down those ‘chunky black box qualifications’ into more transparent, granular, and fluid credentials.

For example, when I say I worked for Mozilla it usually piques people’s interest. I then have to go on and explain what I did during my time there. This isn’t easy given the amount of different things you do and learn in an organisation that you were with for three years. Yes, I had two different job titles, but I learned a whole load of things that would take time to tease out: working across timezones on a daily basis? Check. Learning how to use GitHub for development? Check. Consensus-based decision-making? Check.

Not every organisation is in a position to offer a trial period like Automattic. Nor would every individual be able to take up their offer. However, much as some people start off as consultants for organisations and then end up employed by them, there is value in getting to know people in a better way than the traditional CV and interview process allows. If we need better filters then we need smaller sieves.

For the past five years I’ve been working on Open Badges, a web-native way to issue trusted, portable, digital credentials. In the situation under consideration, I think there are a few ways in which badges can be used to unlock those chunky black box qualifications.

  1. Granularity – instead of looking at qualifications that act as proxies, we can evidence knowledge, skills, and behaviours directly.
  2. Evidence – whereas LinkedIn profiles and CVs are a bunch of claims, Open Badges can include a bunch of evidence. Proof that someone has done something is just a click away.
  3. Portability – instead of credentials being on separate pieces of paper or in various digital silos, Open Badges can be displayed together, in context, on the web. They are controlled and displayed at the earner’s discretion.
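By way of illustration, here’s roughly what a hosted badge assertion looks like, sketched in Python as the JSON document it would be published as. This is a loose sketch based on the Open Badges specification; the URLs, badge name, and identity hash are invented for the example. The point is that the `evidence` field links the claim to proof, and the `verification` field tells a consumer how to check it.

```python
import json

# A rough sketch of a hosted Open Badges assertion (all values invented).
assertion = {
    "@context": "https://w3id.org/openbadges/v2",
    "type": "Assertion",
    "id": "https://example.org/assertions/123.json",
    "recipient": {
        "type": "email",
        "hashed": True,
        # A hash of the earner's email address, so the badge can be
        # verified without publishing the address itself.
        "identity": "sha256$c7ef86...",
    },
    # The badge class: what was earned and who issued it.
    "badge": "https://example.org/badges/timezone-collaboration.json",
    # Evidence: proof of the claim is one click away.
    "evidence": "https://example.org/portfolio/github-activity",
    "issuedOn": "2016-02-01T00:00:00+00:00",
    # 'Hosted' verification: consumers re-fetch this document from its id URL.
    "verification": {"type": "hosted"},
}

print(json.dumps(assertion, indent=2))
```

Because the assertion is just a web-hosted JSON document under the earner’s control, it can be displayed anywhere on the web rather than sitting in a single silo.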

I’m excited by the resurgence in apprenticeships and vocational education. I’m delighted to see more and more alternative ways organisations are finding to hire people. What I’m optimistic about most of all, though, is the ability for organisations to find exactly the right fit based on new forms of credentialing. It’s going to take a cultural shift in hiring, but the benefits for those who take the leap will be profound.

Image via Nomad Pictures

Some thoughts on time, performativity, and the State.

Whenever I come across a longer article via Twitter, Zite, Feedly, Google+ or the other places that I browse headlines, I add it to my Pocket account. The advantage of doing this is not only that I can read those articles at my leisure (such as when I’m on a train journey) but also that the app formats them in a way that’s actually readable.

[Image: ‘denbora berdea’ (Basque for ‘green time’)]

A while ago I added an article entitled Time Wars to my Pocket account. It’s by ‘leading radical blogger and professor Mark Fisher’ and is about the neo-liberal assault on time. I found it fascinating. You should go and read it.

In the UK at the moment we have the situation where the government has declared war on public sector pay and pensions. It’s dressed up to look like something different, of course, but even a quick peek behind the curtains reveals how ministers manipulate the levers in a futile attempt to make taxpayer-funded institutions cost the government less.

Unfortunately, the ideology of the Conservative government (let’s face it, the Liberals aren’t doing much despite their coalition) is predicated upon a lazy idea of the market as the solution to every problem facing society. Climate change? Carbon trading! NHS costs rising? Bring in private providers! Educational ‘standards’ not improving fast enough? De-regulate everything!

The logic of Capital is everywhere. One very prominent and obvious effect of this is the increasingly casualised and temporary jobs on offer. Who has a permanent job with a guaranteed final salary pension these days? Which of us spends more than five years with the same employer? Where are the ‘good’ jobs (the ones that my Grandmother talks about) for graduates?

At the most simple level, precarity is one consequence of the “post-Fordist” restructuring of work that began in the late 1970s: the turn away from fixed, permanent jobs to ways of working that are increasingly casualised. Yet even those within relatively stable forms of employment are not immune from precarity. Many workers now have to periodically revalidate their status via systems of “continuous professional development”; almost all work, no matter how menial, involves self-surveillance systems in which the worker is required to assess their own performance. Pay is increasingly correlated to output, albeit an output that is no longer easily measurable in material terms.

Of course, there are massive benefits to the casualisation of labour. For example, I now work variable hours from home as part of a team that spans at least five timezones. I get to choose when to take my holidays. My performance is based upon my output rather than the number of hours I spend at my desk.

But there’s a creeping performative element to all of this. When you can work any time of the day, it’s tempting to work more, not less – especially when you’re dealing with things you’re interested in. I’m fortunate in that I work for Mozilla, whose politics and communitarian approach correlate strongly with my own. But if I didn’t work for a non-profit (or a forward-thinking organisation such as Valve) then I think I’d be looking over my shoulder all the time. Self-regulation and censorship, as George Orwell showed in Nineteen Eighty-Four, are regulation and censorship of the worst kind.

The casualisation of labour is great for those working in what is loosely (and imprecisely) defined as ‘the knowledge economy’. Give me a laptop and an internet connection and I can work anywhere. Others, however, depend upon being physically co-located with colleagues to earn their money. Whilst the flexibility that goes hand-in-hand with casualisation is great for knowledge workers, those who can’t choose where and when they work get none of it. In fact, all they get is the downside: the uncertainty.

Uncertainty is a negative side effect that some of us are willing to live with because of the positives on the flip side of the coin. But that flip side largely doesn’t exist for those who rely on physical co-location to do their jobs. I’m thinking teachers. I’m thinking doctors and nurses and hospital staff. I’m thinking pretty much every job in the public sector. These aren’t occupations that we should be looking to casualise: we should be making people in these positions feel more secure, not less:

The neoliberal attacks on public services, welfare programmes and trade unions mean that we are increasingly living in a world deprived of security or solidarity. The consequence of the normalisation of uncertainty is a permanent state of low-level panic. Fear, which attaches to particular objects, is replaced by a more generalised anxiety, a constant twitching, an inability to settle.

Everything that can be outsourced to the market in our brave new Big Society is packaged up and sold to the highest bidder. Witness the G4S Olympic security debacle, for example. At the same time, training and career development is also outsourced to the market. Instead of taxpayer-funded institutions such as hospitals and schools developing and keeping experienced, knowledgeable staff we’re increasingly faced with uncertain, temporary workers representing third-party organisations. Any ‘innovation’ within such organisations by necessity has to be top-down, as the mechanisms for grassroots innovation are stymied by HR practices:

The reality, however, is that innovation requires certain forms of stability. The disintegration of social democracy has had a dampening, rather than a dynamic, effect on culture in highly neoliberalized countries such as the UK. Fredric Jameson’s claims that late capitalist culture would be given over to pastiche and retrospection have turned out to be extraordinarily prophetic.

I’m not arguing for full communism now. Nor am I advocating a King Canute-style position against the incoming tide. What I am questioning, however, is whether the logic of Capital and private enterprise should be applied to the institutions of our state. Some things, after all, are public goods.

I’ll end where Mark Fisher’s article starts, commenting only that we live in an increasingly polarised society where the haves get to choose what the have-nots get to do with their time:

Time rather than money is the currency in the recent science fiction film In Time. At the age of 25, the citizens in the future world the film depicts are given only a year more to live. To survive any longer, they must earn extra time. The decadent rich have centuries of empty time available to fritter away, while the poor are always only days or hours away from death.

Go and read the article. It’s worth it, trust me. 🙂

Image CC BY-SA Mr. Theklan
