(by Adam Elkus. I will be guest posting occasionally at Zenpundit. I am a PhD student in Computational Social Science at George Mason University, and a blogger at The Fair Jilt, CTOVision, Analyst One, and my own blog, Rethinking Security. I write a column for War on the Rocks, and I was once a blogger at Abu Muqawama. You can follow me on Twitter here. )
I’ve been looking at some recurring themes regarding technocracy, control, elites, and governance in debates surrounding the politics of algorithms, drone warfare, the Affordable Care Act, and big data‘s implications for surveillance and privacy. Strangely enough, I thought of James Burnham.
Paleoconservative writer Burnham’s scribblings about the dawn of a “managerial revolution” gave rise to conservative fears of a “managerial state” governed by a technocratic elite that uses bureaucracy for the purpose of social engineering. In Burnham’s original vision (which predicted that capitalism would be replaced not by socialism but by managerial rule), the dominant elites were “managers” who controlled the means of production. Other conservative political thinkers later expanded this concept to refer to an abstract class of technocratic elites ruling a large, bureaucratic system.
Burnham had a different vision of dystopia than George Orwell, who envisioned a rigid tyranny held together by regimentation, discipline, pervasive surveillance, and propaganda. The managerial state, by contrast, was an entity that structured choice. The power Burnham and others envisioned flowed from control of the most important mechanisms of industrial production, and from the modern state’s bureaucratic capacity to subtly engineer cultural and political outcomes. Building on Burnham and those he influenced, one potential information-age extension of the “managerial” theory is the idea of the “automatic state.”
The “automatic state” is a loose term that collects various isolated ideas about a polity in which important regulatory functions are performed by computational agents of varying intelligence. These ideas eschew the industrial-era horror of a High Modernist apocalypse of regimentation, division of labor, social engineering, and pervasive surveillance. The underlying architecture of the automatic state, though, is a product of specific political and cultural assumptions that shape its design. Though assumed to be neutral, the system automatically, continuously, and pervasively implements regulations and decision rules that seek to shape, guide, and otherwise manipulate social behavior.
Indeed, a recurring theme in some important political and social debates underway is that changes in technology allow a small group of technocrats to control society by structuring choices. The data signatures that all individuals generate and the online networks in which they participate are a source of power for both the corporate and government worlds. The biases of algorithms are a topic of growing interest. Some explicitly link the unprecedented ability to collect, analyze, and exploit data with enhanced forms of violence. Others argue that the ability to record and track large masses of data will prop up authoritarian governments. Activists regard the drone itself–and the specter of autonomous weapons–as a robotic symbol of imperialism.
While an automatic state may be plausible elsewhere, the top-down implications of Burnham’s technocracy do not fit America particularly well. Some of the most prominent users of the relevant automatic state technologies are corporations. While cognitive delegation to some kind of machine intelligence can be seen in everything from machine learning systems to airplane autopilot functions, it would be a big stretch to say that the powerful algorithms deployed in Silicon Valley and Fort Meade serve a macro-level social regulatory function.
Certainly it is clear that mastery of computational intelligence’s commercial applications has created a new Californian commercial elite, but one that is mostly uninterested in governance. The government’s faulty deployment of large-scale information technology systems (as seen in the Obamacare rollout debacle) also does not augur well for an American automatic state elite. However, some interesting (and troubling) possibilities present themselves at the state, county, and municipal levels of governance.
Cash-strapped state governments, looking for more precise ways of extracting tax revenue for road projects, are seeking to put a mile-tracking black box in every car. Drivers would be charged on a pay-per-mile basis, and government planners hope the system can better incentivize certain driving patterns. Tools like the black box may suggest the dawn of a new age of revenue extraction enabled by cheap, precise, and persistent surveillance. Why not, say, use a black box that (in the manner of a traffic camera) automatically fines the driver for exceeding the speed limit or violating a traffic regulation?
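None of this requires exotic technology. A toy sketch of the kind of decision rule such a box might run makes the point; every rate, fine, and threshold below is invented for illustration, not drawn from any actual program:

```python
# Hypothetical sketch of an automatic-state decision rule: a mileage
# "black box" that assesses per-mile road charges and, in the manner of
# a traffic camera, issues automatic fines for recorded speeding.
# All rates and thresholds are invented for illustration.

RATE_PER_MILE = 0.015   # dollars per mile driven (hypothetical)
SPEEDING_FINE = 120.00  # flat fine per recorded violation (hypothetical)

def assess_charges(trip_miles, speed_samples, speed_limit):
    """Return (mileage_charge, total_fines) for one trip.

    speed_samples: speeds recorded by the box in mph; each sample
    over the limit is treated as one automatically issued violation.
    """
    mileage_charge = trip_miles * RATE_PER_MILE
    violations = sum(1 for s in speed_samples if s > speed_limit)
    return mileage_charge, violations * SPEEDING_FINE

# A 200-mile trip with one recorded speed over a 65 mph limit:
charge, fines = assess_charges(200, [55, 62, 71, 58], speed_limit=65)
```

The simplicity is the point: once the sensor data exists, turning a surveillance stream into a revenue stream is a few lines of policy expressed as code.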
In contrast to Burnham’s vision of technocratic elites, those who benefit from these technologies are the same unwieldy collection of local bureaucrats that Americans must inevitably put up with every time they trudge down to the local DMV. While this may seem innocuous, local government’s thirst for new revenue has led to disturbing practices like the drug war’s habit of asset forfeiture. Though legal, asset forfeiture has stimulated corruption and incentivized constant drug raids as a way to secure more funds.
What technologically-enhanced local governments may bring is the specter of automatic and pervasive enforcement of law. The oft-stated notion that millions of Americans break several laws every day suggests why automatic and pervasive enforcement of rules and regulations could be problematic. As the earlier reference to asset forfeiture hints, this is not merely a question of a rash reaction to the substantial fiscal problems that local political elites face.
Politics is a game of incentives, and it is also a question of collective action and cooperation. First, as many people noted in analyses of mayoral corruption in the District of Columbia, local politicians often have little hope of advancing to higher levels of prominence. They thus have much less incentive to delay gratification in the hope that a clean image will one day make them more important; they can either reward themselves while they have power, or forfeit the potential gains of public office. Second, the relative autonomy of state and local governments stems from the lack of the top-down coordination mechanisms seen in other, more statist political systems. The decision horizon of, say, a county police department is extremely limited, so it can be expected to advocate for itself regardless of the overall effect. These dynamics are worsened by the fiscal impact of government dysfunction, the decay of infrastructure, privatization, and the increasingly limited resources available to state and local governments.
This mismatch is somewhat understandable, given the context of Burnham’s original theory. His inspiration was the then-dominant corporatist models of 1930s Germany, the Soviet Union, Italy, and other centrally planned industrial giants. He also misread the political importance of the New Deal, claiming it was a sign of American transformation into a managerial state. As Lynn Dumenil noted in her history of interwar America (and in her own lectures I attended as an undergraduate), the New Deal was not a complete break from Herbert Hoover’s own conception of political economy. Hoover envisioned a form of corporatist planning in which the biggest corporate interests would harmoniously cooperate on the most important political-economic issues of the day, with the government as facilitator. The technocratic corporatism implied by Hoover’s vision was Burnham-like, and the New Deal was a continuation of this model; it differed only in making the government the driver of industrial political economy rather than its designer and facilitator.
However, sustainment of a New Deal-like corporatist model depends on elite agreement. This was not to last. George Packer, Chris Hayes, and Peter Turchin have all noted that today’s American elites do not have the level of cohesion necessary to sustain a technocratic state. Instead, they are competing with each other in a zero-sum manner. Silicon Valley entrepreneurs have flirted with the idea of secession. The US government cannot pass a budget that funds the government for more than a few months. A “submerged state” of sub rosa government regulations twists policy towards an affluent few and private interests. The notion that financial regulation was compromised by regulatory capture is not controversial. Finally, a normative conception of elite appropriateness is no longer shared.
What all this suggests is that the impact of an automatic state will be scattered and aggregate. It will be experienced largely at the local level, through revenue-extracting technologies that open up hitherto untapped sources of advantage. Political rent-seeking, not social engineering, is the byword. The mechanism of extracting rents, however, is very “managerial” in its operation. In my home state of California, overt attempts to increase revenue have been consistently thwarted by political resistance. The potential for automatic state technologies to become “political technology” that solves this problem through much less obvious revenue extraction mechanisms is understandably very attractive. However, the ability to process a constant stream of data from automatic state technologies will be contingent on the computational power available, which will vary with context.
Where the automatic state becomes politically and culturally influential beyond pure rent extraction is also an area where localism will likely matter more. Computational capabilities for automatic enforcement and the subtle structuring of political choice are difficult to deploy at the national level except in a fairly piecemeal way, due to national political constraints. At the local level, however, where one party or interest may face far fewer constraints, it is much more likely that a computational system designed to structure cultural or political choices toward a preferred result could emerge. Even without such partisan considerations, there is always a school district looking to ban student behavior it dislikes, or a prosecutor seeking to ram through a given result, that would see such technology as a boon.
All of this isn’t to completely dismiss the potential for federal usage of these technologies. But, as seen in the NSA scandal, mass domestic surveillance in an environment where the public does not fear another 9/11-style event may not be politically sustainable in its current form. A patchwork of “Little Brothers” tied to a revenue extraction mission, however, is a far more diffuse and difficult political issue to organize around.
If the automatic state comes, it is not likely that it will come in the form of a Big Brother-like figure hooked up to a giant machine. Rather, it might very well be a small black box in your car that measures your mileage–and is so successful that it is soon modified to track your speed and compliance with traffic regulations.