The Automatic State?

(by Adam Elkus. I will be guest posting occasionally at Zenpundit. I am a PhD student in Computational Social Science at George Mason University, and a blogger at The Fair Jilt, CTOVision, Analyst One, and my own blog, Rethinking Security. I write a column for War on the Rocks, and I was once a blogger at Abu Muqawama. You can follow me on Twitter here.)

I’ve been looking at some recurring themes regarding technocracy, control, elites, and governance in debates surrounding the politics of algorithms, drone warfare, the Affordable Care Act, and big data‘s implications for surveillance and privacy. Strangely enough, I thought of James Burnham.

The paleoconservative writer Burnham’s scribblings about the dawn of a “managerial revolution” gave rise to conservative fears about a “managerial state,” governed by a technocratic elite that utilizes bureaucracy for the purpose of social engineering. In Burnham’s original vision (which predicted capitalism would be replaced by socialism), the dominant elites were “managers” who controlled the means of production. But other conservative political thinkers later expanded this concept to refer to an abstract class of technocratic elites ruling a large, bureaucratic system.

Burnham had a different vision of dystopia from George Orwell, who envisioned a rigid tyranny held together by regimentation, discipline, pervasive surveillance, and propaganda. The managerial state, by contrast, was an entity that structured choice. The power that Burnham and others envisioned issued from dominance of the most important mechanisms of industrial production, and from the modern state’s bureaucratic capacity to subtly engineer cultural and political outcomes. Building on Burnham and those he influenced, one potential information-age extension of the “managerial” theory is the idea of the “automatic state.”

“Automatic state” is a loose term that collects various isolated ideas about a polity in which important regulatory functions are performed by computational agents of varying intelligence. These ideas eschew the industrial-era horror of a High Modernist apocalypse of regimentation, division of labor, social engineering, and pervasive surveillance. The underlying architecture of the automatic state, though, is a product of specific political and cultural assumptions that influence its design. Though assumed to be neutral, the system automatically, continuously, and pervasively implements regulations and decision rules that seek to shape, guide, and otherwise manipulate social behavior.

Indeed, a recurring theme in some important political and social debates underway is that changes in technology allow a small group of technocrats to control society by structuring choices. The data signatures that all individuals generate and the online networks they participate in are a source of power for both the corporate and government worlds. The biases of algorithms are a topic of growing interest. Some explicitly link the unprecedented ability to collect, analyze, and exploit data with enhanced forms of violence. Others argue that the ability to record and track large masses of data will prop up authoritarian governments. Activists regard the drone itself, and the specter of autonomous weapons, as a robotic symbol of imperialism.

While an automatic state may be plausible elsewhere, the top-down implications of Burnham’s technocracy do not fit America particularly well. Some of the most prominent users of the relevant automatic state technologies are corporations. While cognitive delegation to some kind of machine intelligence can be seen in everything from machine learning systems to airplane autopilot functions, it would be a big stretch to say that the powerful algorithms deployed in Silicon Valley and Fort Meade serve a macro-level social regulatory function.
