
The Risks of Accountability Sinks

by Donald Marshall | Dec 3, 2024 | Articles, Second Edition

Donald Marshall reviews The Unaccountability Machine: Why Big Systems Make Terrible Decisions – and How the World Lost Its Mind by Dan Davies


Dan Davies’ book opens with a memorable, and horrifying, story. In 1999 a shipment of 440 squirrels arrived at Amsterdam’s Schiphol Airport from Beijing. Unfortunately, the shipment didn’t have the proper documentation, and so the squirrels could not be sent on to their intended recipients. Staff working for the airline that had shipped them followed the government-set policy for such a scenario, which was either to return animals to their place of origin or euthanise them. 


This policy, presumably, had been put in place to deter people from shipping animals without the right paperwork, but it produced an unfortunate outcome when implemented. For some reason the squirrels could not be returned to China, and as a result the airline staff killed them in an industrial animal shredder. This led to public outrage and, eventually, an inquiry. The airline apologised. In its apology it noted that staff had followed the policy correctly, but said that they should have recognised that what they were being asked to do was unethical and deviated from it.


The policy that the airline staff followed is an example of what Davies calls an ‘accountability sink’. An accountability sink is a policy or process that allows decisions to be taken without anyone being clearly accountable for them. In this case, it would be hard to say that the people who made the policy, the people who implemented it or the people running the airline were strictly to blame for what happened.


Davies traces the history of this idea to a movement called Cybernetics, a precursor of systems thinking, and in particular to the writings of Stafford Beer, one of its key architects. Beer wrote about the role of information and control in organisations. Accountability sinks exist, he explained, when all the possible information that a complex environment might produce is simplified into a single rule, such as a policy that can be followed and cannot easily be overridden (even when some unexpected situation means that it probably should be). The risk is that decisions which don’t fit the specific situation can harm the people on the receiving end of the sink, who then become disenchanted.


Most of the book focuses on how this plays out in the private sector, but Davies also discusses examples from government. The Civil Service is in the business of designing and implementing policies, and policies often function as accountability sinks to some degree because they are based on a simplified model of the world that decision makers can follow. Policies to guide decisions in contentious areas like planning, immigration and sentencing tend to reduce the individual accountability of the people making the decisions.


This on its own doesn’t mean policies are bad, though. Davies rightly acknowledges that having a general policy, rather than relying on individual decision makers, often leads to better outcomes. Policies can make decisions more consistent, transparent and fair. They allow governments to make decisions when there is no way to consider all the relevant information each time a decision is made. Unlike a company, the Government often has no choice about the areas in which it operates; if it is particularly challenging to make decisions in a particular area, it can’t just shift to a different market. Policies place some order around the trickiest issues.


The risk, of course, is that policies or systems become detached from their original purpose and lack any mechanism to adapt. In 1981, for example, the French civil servant Guy Abeille was asked by his head of office to come up with a rule for how high the fiscal deficit should be allowed to rise relative to GDP; his boss wanted a plausible way to say “No” to ministers asking for money. The French fiscal deficit at the time was around 2% of GDP, so Abeille proposed that it should not exceed 3%. This rule was then applied in France for the next ten years. In 1997, it was incorporated into the EU’s Stability and Growth Pact, and it played a major role in the decisions imposed on Eurozone countries following the 2008 crash. Davies argues that this is an example of a government accountability sink, whereby a policy enabled difficult decisions to be made with reduced accountability, for better or worse.


What lessons should civil servants take from the book? Davies ends on a positive note: the way to avoid the negative aspects of accountability sinks is to ensure that systems listen to feedback from the people they affect. Politicians and civil servants should design policies with metaphorical ‘red handles’, like the ones used to stop trains in an emergency, which can be pulled to alert the managers of a policy to a problem, such as an unexpected situation that was not considered when the policy was designed.


In practice, this might mean providing a way for decision makers on the ground (or the people affected by their decisions) to report major concerns directly to a senior manager with clear accountability for the policy. Civil servants should also devote time to observing and analysing the practical results of the policies they oversee. 


Stafford Beer’s teenage son built, in the family garage, a prototype of a machine that would allow every household in a country to provide regular feedback on how its members felt. The problem then would have been what to do with all the information; nowadays, social media combined with data science and artificial intelligence provides a plausible way of getting near-instant mass feedback, summarised for the consumption of managers. If this sounds like a data dashboard, that’s not a coincidence: Stafford Beer invented the concept of a dashboard with metrics.


The Unaccountability Machine is a provocative and worthwhile read for anyone interested in what happens when organisations have to deal with increasing complexity in their environment, and a reminder that it is crucial to build in the right degree of accountability.


Donald Marshall is the Team Leader for Strategy, Projects and Analysis in the Scotland Office.

