The Automatic State?

(by Adam Elkus. I will be guest posting occasionally at Zenpundit. I am a PhD student in Computational Social Science at George Mason University, and a blogger at The Fair Jilt, CTOVision, Analyst One, and my own blog, Rethinking Security. I write a column for War on the Rocks, and I once was a blogger at Abu Muqawama. You can follow me on Twitter here.)

I’ve been looking at some recurring themes regarding technocracy, control, elites, and governance in debates surrounding the politics of algorithms, drone warfare, the Affordable Care Act, and big data’s implications for surveillance and privacy. Strangely enough, I thought of James Burnham.

Paleoconservative writer Burnham’s scribblings about the dawn of a “managerial revolution” gave rise to conservative fears about a “managerial state,” governed by a technocratic elite that uses bureaucracy for social engineering. In Burnham’s original vision (which predicted that capitalism would be replaced by socialism), the dominant elites were “managers” who controlled the means of production. But other conservative political thinkers later expanded the concept to refer to an abstract class of technocratic elites ruling a large, bureaucratic system.

Burnham had a different vision of dystopia than George Orwell, who envisioned a rigid tyranny held together by regimentation, discipline, pervasive surveillance, and propaganda. Rather, the managerial state was an entity that structured choice. The conception of power that Burnham and others envisioned issued from dominance of the most important industrial production mechanisms, and the bureaucratic power of the modern state to subtly engineer cultural and political outcomes. Building on Burnham and those he influenced, one potential information-age extension of the “managerial” theory is the idea of the “automatic state.”

“Automatic state” is a loose term that collects various isolated ideas about a polity in which important regulatory functions are performed by computational agents of varying intelligence. These ideas eschew the industrial-era horror of a High Modernist apocalypse of regimentation, division of labor, social engineering, and pervasive surveillance. The underlying architecture of the automatic state, though, is a product of specific political and cultural assumptions that influence its design. Though assumed to be neutral, the system automatically, continuously, and pervasively implements regulations and decision rules that seek to shape, guide, and otherwise manipulate social behavior.

Indeed, a recurring theme in some important political and social debates underway is that changes in technology allow a small group of technocrats to control society by structuring choices. The data signatures that all individuals generate and the online networks they participate in are a source of power for both the corporate and government worlds. The biases of algorithms are a topic of growing interest. Some explicitly link the unprecedented ability to collect, analyze, and exploit data with enhanced forms of violence. Others argue that the ability to record and track large masses of data will prop up authoritarian governments. Activists regard the drone itself–and the specter of autonomous weapons–as a robotic symbol of imperialism.

While an automatic state may be plausible elsewhere, the top-down implications of Burnham’s technocracy do not fit America very well. Some of the most prominent users of the relevant automatic state technologies are corporations. While cognitive delegation to some kind of machine intelligence can be seen in everything from machine learning systems to airplane autopilot functions, it would be a big stretch to say that the powerful algorithms deployed in Silicon Valley and Fort Meade serve a macro-level social regulatory function.

Certainly it is clear that mastery of computational intelligence’s commercial applications has made a new Californian commercial elite, but it is mostly not interested in governance. Faulty government deployment of large-scale information technology systems (as seen in the Obamacare debacle) also does not augur well for an American automatic state elite. However, some interesting — and troubling — possibilities present themselves at state, county, and municipal levels of governance.

Cash-strapped state governments looking for more precise ways of extracting tax revenue for road projects are seeking to put a mile-tracking black box in every car. Drivers would be charged on a pay-per-mile basis, and government planners hope the system can better incentivize certain driving patterns. Tools like the black box may suggest the dawn of a new age of revenue extraction enabled by cheap, precise, and persistent surveillance. Why not, say, utilize a black box which (in the manner of a traffic camera) automatically fines the driver for going over the speed limit or violating a traffic regulation?
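The jump from mileage metering to automatic fining is small in code terms, which is part of what makes it attractive. As a rough sketch (the function names, rates, and thresholds below are all hypothetical illustrations, not any state’s actual policy), the billing logic of such a box might look like:

```python
# A minimal sketch of the billing rules a mileage "black box" might implement.
# All rates and thresholds are invented for illustration. Amounts are kept in
# whole cents so the arithmetic stays exact.

def mileage_charge(miles_driven, rate_cents_per_mile=2):
    """Pay-per-mile road charge: the proposed replacement for a flat gas tax."""
    return miles_driven * rate_cents_per_mile

def speeding_fines(speed_readings, limit_mph=65, fine_cents=15000):
    """Automatically fine every logged reading over the limit,
    the way a traffic camera fines every captured violation."""
    return sum(fine_cents for mph in speed_readings if mph > limit_mph)

def monthly_bill(miles_driven, speed_readings):
    """Total owed: the road charge plus any automatic fines."""
    return mileage_charge(miles_driven) + speeding_fines(speed_readings)

# A driver logs 1,000 miles with two readings over the 65 mph limit:
# $20.00 in mileage charges plus $300.00 in automatic fines.
print(monthly_bill(1000, [60, 72, 80]) / 100)  # prints 320.0 (dollars)
```

The point of the sketch is how trivially the second function bolts onto the first: once the box exists for revenue metering, automatic enforcement is a few added lines, not a new system.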

In contrast to Burnham’s vision of technocratic elites, those who benefit from these technologies are the same unwieldy group of local bureaucrats that Americans must inevitably put up with every time they trudge down to their local DMV. While this may seem innocuous, local government’s thirst for new revenue has led to disturbing practices like the drug war habit of asset forfeiture. Though legal, asset forfeiture has stimulated corruption and also incentivized constant drug raiding in order to secure more funds.

What technologically-enhanced local governments may bring is the specter of automatic and pervasive enforcement of the law. The oft-stated notion that millions of Americans break at least several laws every day suggests why automatic and pervasive enforcement of rules and regulations could be problematic. And as the earlier reference to asset forfeiture hints, this is not merely a rash reaction to the substantial fiscal problems that local political elites face.

Politics is a game of incentives, and it is also a question of collective action and cooperation. As many people noted in analyses of mayoral corruption in the District of Columbia, local politicians often have little hope of advancing to higher levels of prominence. Thus, they have much less incentive to delay gratification in the hope that a clean image will one day help them become more important. They can either reward themselves while they have power, or forfeit the potential gains of public office. Second, the relative autonomy of state and local governments is possible due to the lack of the top-down coordination mechanism seen in other, more statist political systems. The decision horizon of, say, a county police department is extremely limited, so it can be expected to advocate for itself regardless of the overall effect. These mechanisms are worsened by the fiscal impact of government dysfunction, the decay of infrastructure, privatization, and the limited resources increasingly available to state and local governments.

This mismatch is somewhat understandable, given the context of Burnham’s original theory. His inspiration was the then-dominant corporatist models seen in 1930s Germany, the Soviet Union, Italy, and other centrally planned industrial giants. He also misunderstood the political importance of the New Deal, claiming it was a sign of American transformation into a managerial state. As Lynn Dumenil noted in her history of interwar America (and in her own lectures I attended as an undergrad), the New Deal was not a complete break from Herbert Hoover’s own conception of political economy. Hoover envisioned a form of corporatist planning in which the biggest corporate interests would harmoniously cooperate on the most important political-economic issues of the day, with the government as facilitator. The technocratic corporatism implied by Hoover’s vision was Burnham-like, and the New Deal was a continuation of this model. It differed only in that it made the government the driver of industrial political economy instead of its designer and facilitator.

However, sustainment of a New Deal-like corporatist model depends on elite agreement. This was not to last. George Packer, Chris Hayes, and Peter Turchin have all noted that today’s American elites do not have the level of cohesion necessary to sustain a technocratic state. Instead, they are competing with each other in a zero-sum manner. Silicon Valley entrepreneurs have flirted with the idea of secession. The US government cannot pass a budget that funds the government for more than a few months. A “submerged state” of sub rosa government regulations twists policy towards an affluent few and private interests. The notion that financial regulation was compromised by regulatory capture is not controversial. Finally, a normative conception of elite appropriateness is no longer shared.

What this all suggests is that the impact of an automatic state will be scattered and aggregate. It will be experienced in large part locally, through revenue-extracting technologies that open up hitherto untapped sources of advantage. Political rent-seeking, not social engineering, is the byword. The mechanism of extracting rents, however, is very “managerial” in its operation. In my home state of California, overt attempts to increase revenue have been consistently thwarted by political resistance. The potential for automatic state technologies to become “political technology” that fixes this problem through much less obvious revenue extraction mechanisms is understandably very attractive. However, the ability to process a constant stream of data from automatic state technologies will be contingent on the computational power available, which will vary contextually.

Where the automatic state becomes politically and culturally influential beyond pure rent extraction is also an area where localism will likely matter more. Computational capabilities for automatic enforcement and subtle structuring of political choice are difficult to deploy at the national level except in a fairly piecemeal way, due to national political constraints. However, at the local level, where one party or interest may face vastly fewer constraining influences, it is much more likely that a computational system designed to structure cultural or political choices toward a preferred result could emerge. Even without such partisan considerations, there is always a school district seeking to ban student behavior it dislikes, or a prosecutor seeking to ram through a given result, that would see such technology as a boon.

All of this isn’t to completely dismiss the potential for federal usage of these technologies. But, as seen in the NSA scandal, mass domestic surveillance in an environment where the public is not afraid of a 9/11-esque event occurring may not be politically sustainable in its current form. A patchwork of “Little Brothers” tied to a revenue extraction mission, however, is a far more diffuse and difficult political issue to organize around.

If the automatic state comes, it is not likely that it will come in the form of a Big Brother-like figure hooked up to a giant machine. Rather, it might very well be a small black box in your car that measures your mileage–and is so successful that it is soon modified to track your speed and compliance with traffic regulations.

8 Responses to “The Automatic State?”

  1. Charles Cameron Says:

    Hi Adam, and welcome to Zenpundit, it’s good to have you with us.
    .
    **
    .
    Big Brother in Orwell is pretty much a “demonic parody” or “inversion”, as Northrop Frye would have said, of the omniscient God of the Abrahamic religions… a topic on which Death and Taxes had a post not so long ago titled Surveillance and God: Religion as NSA-style Big Brother. I’ve cherry-picked from their assortment of quotes to bring you the following:

    Judaism: 
    .
    Proverbs 15.3: The eyes of the LORD are in every place, beholding the evil and the good.
    .
    Psalm 139.1-3: O lord, thou hast searched me, and known me. Thou knowest my downsitting and mine uprising, thou understandest my thought afar off. Thou compassest my path and my lying down, and art acquainted with all my ways. 
    .
    Christianity:
    .
    Hebrews 4.13: Neither is there any creature that is not manifest in his sight: but all things are naked and opened unto the eyes of him with whom we have to do.
    .
    Islam:
    .
    Qur’an 89.14: Indeed, your Lord is in observation.
    .
    Qur’an 5.99: And Allah knows whatever you reveal and whatever you conceal.

    Observation, search, secrets and transparency — there’s a lot of NSA-style activity in these monotheistic theologies to be sure.
    .
    I deduce that your “patchwork of Little Brothers” would be polytheistic by comparison!  

  2. Charles Cameron Says:

    As our regular readers may know, I’m interested in graphics, parallelisms and theological billboards… and the Big Brother / Omniscient God motif is obviously one that has penetrated popular culture…
    .
    [images not shown]
    .
    I suppose the orthodox response to this would be “God is not mocked” [Galatians 6.7]. The lower of these two images, taken from near Farmington, New Mexico, is itself a juxtaposition of two messages that could provoke interesting meditations on spirit, flesh, virtue, sin, irony, secularism, belief and much more besides.

  3. zen Says:

    .
    All of this isn’t to completely dismiss the potential for federal usage of these technologies. But, as seen in the NSA scandal, mass domestic surveillance in an environment where the public is not afraid of a 9/11-esque event occurring may not be politically sustainable in its current form. A patchwork of “Little Brothers” tied to a revenue extraction mission, however, is a far more diffuse and difficult political issue to organize around.
    .
    Excellent.
    .
    This is a very real political problem. Not least because historically people will then turn to a powerful center, a Tsar, a man on horseback to get the local tyrants under control. While this has been alien to American experience, it was the case for the empowerment of national monarchies in Europe, Stalin, Mao and forgotten conflicts like the Mexican Revolution. 

  4. larrydunbar Says:

    Ah yes. The government knows what kind of state you’re in by the automatics in your life. How novel is that? Electrical power is sold on speculation, so the power companies (water, gas, electricity etc.), along with some energy corporations I assume, have gathered information on when, where and how much power you use. They use this information to control the dams and other power stations. So literally when you turn on that light switch you are controlling the power stations throughout the US. Just not enough that you would notice. Makes a little black box in your car pretty small potatoes. But then it might help the speculators on the where and when, if the where and when was being used deceptively, as to make a difference in the speculation.

  5. david ronfeldt Says:

    adam — greetings.  it’s good to see you join here.  congratulations.
    .
    stimulating post warning about a multiplicity of little brothers instead of one big brother.  i’m glad you mention corporate businesses as well as governments as agents of all this.  there’s lots of evidence about algorithmic automaticity growing in the financial industry.  another area may be the insurance industry (esp. auto insurance, and a particular computer system named colossus).  note that such systems structure choice not only for citizens / consumers, but also for agents and others working within a corporation.
    .
    the term automatic state (or automatic corporation) doesn’t seem quite apt to me.  in some ways, algorithmic seems more apt than automatic, but neither seems to quite capture the varieties and nuances of what you raise (nor is it quite captured by my own preference to talk about cybercratic corporatism).  but the lingo aside, it’s a topic well worth raising, tracking, and worrying about. — onward, david

  6. AdamElkus Says:

    Larry, 
    .
    Not saying that the black box is the only or most relevant example. But that it’s an example of one area where revenue extraction has been somewhat clumsy. Taxes, as noted in the story, are an inefficient means of getting revenue. So this is a more efficient means. More hypothetically: anyone on the road knows that traffic regulations and speed limits — as a technical matter — are flouted at will. Not enough police officers or cameras. More efficient to have a regulatory device in the vehicle. 
    .
    David, 
    .
    Insurance industry will probably be the most enthusiastic private sector user. Automatic also clumsy term, maybe “computational” is better.  

  7. david ronfeldt Says:

    yes, i like computational state.  whereas the traditional state rests on law, this new kind of state rests on code.

  8. Lynn Wheeler Says:

    More concerned with keeping the funds flowing than any mission … in fact, failures increase flow of funds … spreading “Success of Failure” culture
    http://www.govexec.com/excellence/management-matters/2007/04/the-success-of-failure/24107/

    For-profit companies harvest much more money from gathering every piece of data … than from any operations that would turn it into information and knowledge.

    NSA Whistleblower: Government Failed to Stop Boston Bombing Because It Was Overwhelmed with Data from Mass Surveillance On Americans
    http://www.nakedcapitalism.com/2013/11/nsa-whistleblower-government-failed-to-stop-boston-bombing-because-it-was-overwhelmed-with-data-from-mass-surveillance-on-americans.html
