Saturday, December 14, 2024

The Unaccountability Machine

 
 
The Unaccountability Machine
Why Big Systems Make Terrible Decisions - And How the World Lost its Mind
by Dan Davies
2024
 
 
The Unaccountability Machine starts with author Dan Davies introducing the relatively accessible idea of 'accountability sinks.' The paradigmatic example of an accountability sink is a gate agent at the airport. Your flight is delayed or canceled, your plans are upended, you might incur significant costs, and the only person you have access to, to complain about all this, is the gate agent, who has no control over the situation, no ability to change your fate, and possibly no way to escalate your complaint beyond a direct supervisor who can only remind you that all this is company policy, and can you please stop being rude to the human shield?
 
Davies, an economist by training who used to work for the Bank of England, is clearly trying to translate complex academic and legal ideas into prose a broad public audience can understand, and he makes two moves from there. First, he diagnoses this kind of breakdown in accountability - a problem caused by 'a decision no one made,' with no one to blame or even complain to - as a problem of organizational structure: there is a break in what should be a feedback loop, and the acted-upon have no way to communicate with the actors or with the people who decide how they should act. Second, he broadens outward to the claim that broken feedback loops and the absence of accountability are why we live in a society where nothing seems to work - where, for example, the biggest banks in the country can crash the global economy by selling bad mortgages that cost people their homes, allegedly without any crimes being committed.
 
The rest of the book is Davies' intellectual history of how we got here. I found this account interesting, and I'm going to give a basic outline in a moment, but I disagree with some of the equivalences he draws. I think there are meaningful differences between companies using their lowest-level employees as accountability sinks, the unintended consequences of 'decisions no one made,' and the inability (or, I think more accurately, the refusal) of our law enforcement to punish executives for the harms caused by the companies they lead.
 
Davies thinks that unaccountability is an unavoidable but accidental consequence of certain organizational structures. I would argue that assigning (or withholding) accountability and blame is always a choice. Executives may deliberately set up their 'customer service' channels to avoid actually hearing from their customers, and may structure their companies to allow profitable law-breaking without creating incriminating evidence, because they want to avoid both public scrutiny and prosecution - but it's not actually impossible, or even unfair, to hold them accountable, to blame and punish them. That's a choice, and it could be made differently.
 
There's also a difference between true 'decisions no one made' and decision-makers trying to hide their role to escape accountability. Things like central banks worldwide setting 2% as their inflation target happen because of what the sociologists DiMaggio and Powell called 'institutional isomorphism' - the leaders of one organization copying another until the copied choice becomes an unofficial standard by default. Situations like that can arise by accident, but we're not necessarily stuck with them. People can meet to discuss, negotiate, and change the standard. They can do this more than once, even routinely! But that's different from Boeing executives showing callous disregard for safety by rushing crash-prone 737 MAX jets to market, and I think Davies errs when he conflates the two.
 
When companies decide to seek profit in ways they know will cause harm or even death - to their workers, customers, or just unfortunate bystanders - that too is a choice. When a leader, whether political or organizational, decides that they have a right to act, and that the acted-upon do not have or deserve a say in the decisions that affect them, that is also a choice, and an ideology, not an accident. And, as recent events have reminded me, while a government or an organization can make it impossible to give meaningful feedback through official channels, people still retain the ability to go around those channels, as whistleblowers, or protesters, or through direct action.
 
Anyway, those critiques aside, I was interested in the intellectual history Davies presents to explain how we got here. In brief, after WWII, two different schools of thought developed theories of how organizations work and how they ought to behave: the interdisciplinary field of cybernetics, and the newly invigorated discipline of economics. Cybernetics accidentally undid itself by developing the information theory behind modern computing (which then proved to be an attractive alternative career), while economics embedded itself in government and industry, eventually providing both the instructions and the intellectual justification for neoliberalism and rising inequality. In Davies' telling, the decisive victory of economics (and, if he's lucky enough to get a Big Short-style Hollywood adaptation, the moment that would surely be the movie's climax) comes on September 11th, 1973, when the democratic Chilean government of Salvador Allende, advised by the cybernetician Stafford Beer, is overthrown by the dictator Augusto Pinochet, backed by the CIA and advised by the economist Milton Friedman. Afterward, we get Reagan and Thatcher, private equity conducting leveraged buyouts, and the general rise of corporate leaders focused on quarterly profits at the expense of long-term sustainability, culminating in the Great Recession of 2008 and the post-housing-bubble world of today, where nothing works but no one is to blame.
 
Today, when we hear the term 'cybernetics,' we think of human-machine hybrids, but as an intellectual field it refers to the study of complex systems, including the ways they regulate themselves in response to a changing environment. The paradigmatic cybernetic process is the feedback loop: the system acts based on input from its environment, its actions change the environment, and that change becomes an input that affects the next action. A stable system helps to maintain its own environment; unstable systems eventually destroy themselves by changing the environment in ways they cannot survive. Think of an ecosystem.
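
To make that concrete, here's a minimal sketch of a feedback loop as a thermostat-style control loop - my own toy illustration in Python, not an example from the book:

    # Toy feedback loop: the system reads its environment, acts,
    # and its action changes the environment it will read next time.
    room_temp = 15.0   # the environment
    target = 20.0      # the system's goal

    for step in range(10):
        error = target - room_temp           # input: how far off are we?
        heating = 1.0 if error > 0 else 0.0  # action based on that input
        # the action changes the environment; heat also leaks away
        room_temp += heating * 0.8 - 0.1
        print(f"step {step}: temp={room_temp:.1f}")

A stable version of this loop hovers near its target; break the feedback (say, by never reading the temperature again) and it either overheats or freezes - which is roughly the book's point about organizations that stop listening.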
 
Many disciplines contributed to cybernetic thought, including medicine, computing, and sociology. One of my favorite theory books, Anthony Giddens' Modernity and Self-Identity, is all about the 'reflexivity' of modern organizations, the way they collect data and change their behavior in response to it - that is, the way they incorporate feedback loops. From the 1940s through the '60s, cybernetics seems to have been a respectable field, but ironically, the great success of some of its projects led the coalition to separate back into its component parts.
 
I think Davies is at his best when he's explaining this part, which is unfamiliar to him too. His discussion of economics benefits from his insider knowledge, but it felt like he assumed a bit too much familiarity with terms and concepts, despite his efforts to make the book accessible to non-expert readers.

1 comment:

  1. Cybernetics in the original sense as you describe here is something I've been fascinated with for a long time, especially the works of Norbert Wiener. This book sounds super interesting!