zenpundit.com » Blog Archive » Excess Complexity is the Route to Extinction

Excess Complexity is the Route to Extinction


Nassim Nicholas Taleb, author of The Black Swan: The Impact of the Highly Improbable and Fooled by Randomness: The Hidden Role of Chance in Life and in the Markets, had an op-ed in FT.com entitled “Ten principles for a Black Swan-proof world” (Hat tip to John Robb and Pundita). Taleb was addressing the global economic crisis, but I was particularly drawn to Taleb’s fifth principle, which has a more general implication:

5. Counter-balance complexity with simplicity. Complexity from globalisation and highly networked economic life needs to be countered by simplicity in financial products. The complex economy is already a form of leverage: the leverage of efficiency. Such systems survive thanks to slack and redundancy; adding debt produces wild and dangerous gyrations and leaves no room for error. Capitalism cannot avoid fads and bubbles: equity bubbles (as in 2000) have proved to be mild; debt bubbles are vicious.

Taleb has encapsulated many important concepts very well here. Up to a certain point, increasing complexity represents an advantage for an evolving system (biological, financial, physical, etc.) by increasing efficiency through specialization, interconnection, diversification, redundancy and checks that mitigate risk. In the earlier part of a development curve, complexity can add to a system’s overall resiliency – to a point.

Superfluous complexity – that which goes beyond the minimum required for additional gains in systemic efficiency or productivity – is a net drag on the system: an economic waste, a source of friction, a cancer, a useless eater of resources and the earliest sign of the system’s inevitable decay. Worse, excess complexity raises the probability of systemic failure by multiplying the number of variables involved in the system’s normal processes. There are more things that can go wrong and more choke points where a catastrophic failure can occur. Increasing the degree of complexity moves the system away from simplicity and reliability and toward chaos and the creativity of emergent properties, but like an ice skater seeking ever greater range, go too far and the ice will crack underfoot.

This is an effect familiar to engineers and scientists, but one that appears to escape the majority of politicians, corporate executives and economists. My co-blogger at Chicago Boyz, Shannon Love, took GE to task for trying to get on the Federal dole by advocating a needless complication of the nation’s power grid:

If Your Grid Had a Brain

GE is advertising to build political support for Obama’s plan to purchase billions of dollars of GE tech in order to make the power grid “smart”.  After all, who would want a “dumb” anything when they could have a “smart” something? 

The reason we should keep things dumb is that in engineering the word “dumb” has a different connotation. In engineering, “dumb” means simple and reliable. 

Increasing complexity in any networked system increases possible points of failure. Worse, the more interconnected the system, i.e., the more any single component affects any other randomly selected component in the system, the faster point-failures spread to the entire system. Power grids are massively interconnected. Every blackout starts with a seemingly trivial problem that, like a pebble falling on a mountain side, triggers an avalanche of failure. 
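Love’s point about interconnection can be sketched with a toy percolation model (my own illustration, not from his post; the function name and all parameters are invented for the example): each failed node knocks out each of its neighbors with some small probability, and the total outage size depends sharply on how densely the network is wired.

```python
import random

def cascade_size(n, degree, p_spread=0.1, seed=2):
    """Toy percolation model of a blackout: when a node fails, each of
    its `degree` randomly chosen neighbours independently fails with
    probability p_spread. Returns the total number of nodes failed
    after one initial point failure."""
    rng = random.Random(seed)
    neighbours = {i: rng.sample([j for j in range(n) if j != i], degree)
                  for i in range(n)}
    failed, queue = {0}, [0]          # node 0 is the trivial first fault
    while queue:
        node = queue.pop()
        for nb in neighbours[node]:
            if nb not in failed and rng.random() < p_spread:
                failed.add(nb)
                queue.append(nb)
    return len(failed)

# With degree * p_spread below 1 the avalanche is subcritical and tends
# to stay local; above 1, a single pebble can take down most of the
# mountainside.
print(cascade_size(500, degree=5))    # sparsely connected grid
print(cascade_size(500, degree=25))   # densely interconnected grid
```

The design choice here mirrors the quote: nothing about any individual node changes between the two runs – only the degree of interconnection – yet that alone decides whether a point failure stays a point failure.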

In the social and political domain, back in the 1990s Philip K. Howard wrote a book called The Death of Common Sense: How Law is Suffocating America, in which he detailed example after example of how the overlawyering of America’s regulatory systems by an emerging and hyper-aggressive legal class produced neither restraint on government abuses nor fine-tuned social outcomes, but instead created a state of paralyzed rigidity, risk aversion, perverse incentives and general dysfunction; in other words, chaos instead of order.

The Obama-ites in the White House are not “socialists” (at least not most of them), but there is a great love of liberal-minded technocracy there, and a seemingly boundless self-confidence in the ability of high-minded, upper-middle-class, progressive wonks and lawyers from the “good schools” (or investment houses – in some cases, both) to micromanage not just our lives, or even the United States of America, but the global economy itself. Sort of a Superempowered Oligarchy of Good Feelings.

The ancient Greeks had a word for that: hubris. More importantly, the Obama-ites are wrong here – adding endless amounts of regulatory complexity is not going to give them the kind of granular control or positive returns that they seek to obtain from the system. Counterintuitively, they should be radically simplifying where and to the degree they safely can instead.

22 Responses to “Excess Complexity is the Route to Extinction”

  1. Moon Says:

    Re: complexity, I think the distinction can be emergent — intrinsic, bottom-up — vs. directed — extraneous, and top-down.  The former is what we get from the universe, the latter what we deserve when we jack around with her (too much).

    Thanks for the link to Taleb’s FT article.  Can’t seem to access it right now, but I’ll find it somewhere.

  2. Cheryl Rofer Says:

    I think you’ve got Obama wrong, but I’ll let that go for now.
    I agree that additional complexity, particularly for complexity’s (or obscurity’s) sake, as in the case of the financial derivatives, is destabilizing.
    But, as I understand the smart grid, it would be simpler than what we have now, which is unstable because of its complexity.
    The North American electrical grid, as it now exists, is a patchwork of grids that have each grown in their own way, with some overarching structure and a great deal of complexity both in stitching the patches to one another and in making that overarching structure accommodate the differences between the patches. The smart grid would make things more uniform (dumber) and eliminate all that stitching together.
    There are other good things that could be done – the science of control is little discussed, but computers have made some rather wonderful things possible there and would probably contribute to the stabilization of the system.
    It’s not the interconnection per se of today’s system, but the cobbled-together nature of that interconnection that is the biggest problem.

  3. democratic core Says:

    Excessive simplicity is equally seductive.  Saying that systems should be no more complex than necessary is a platitude that merely begs the question: what is necessary?  Howard reminds me of Justice "Ollie" Oliphant in the "Rumpole of the Bailey" stories, who believed that every case could be boiled down to "good old-fashioned common sense", invariably reaching the wrong conclusion.  He also reminds me of Einstein’s excellent observation: "Common sense is the collection of prejudices acquired by age eighteen."

  4. zen Says:

    Hi Cheryl,
    Hopefully you are correct re: Obama administration. We shall see in time.

    Rationalizing the grid system would be good as you describe, so long as the problem of facilitating cascading failures is also addressed.

    Hi dc,
    This isn’t a partisan issue, it’s physics. A perfect example of "excess complexity" in action would be the Pentagon’s weapon procurement process.
    I am not "anti-complexity". Complexity pays real dividends for a system but each addition of complexity to any system brings it closer to the point where it becomes first, a net negative and then a potential disaster.  The key is to look at the potential benefits and costs additional complexity will bring and how new additions are likely to interact with the established system. Adding some complexity to a relatively simple system is fairly easy to estimate, adding more complexity to an already complex system can lead to large, unintended, consequences. Looking to simplify, where complexity is not creating added gains, represents a savings ( money, energy, resources, time). To quote Henri Poincare:
    "If we knew exactly the laws of nature and the situation of the universe at the initial moment, we could predict exactly the situation of that same universe at a succeeding moment. but even if it were the case that the natural laws had no longer any secret for us, we could still only know the initial situation approximately. If that enabled us to predict the succeeding situation with the same approximation, that is all we require, and we should say that the phenomenon had been predicted, that it is governed by laws. But it is not always so; it may happen that small differences in the initial conditions produce very great ones in the final phenomena. A small error in the former will produce an enormous error in the latter. Prediction becomes impossible, and we have the fortuitous phenomenon. "
    When a system is already complex, the burden of proof should be on the person proposing additional complexity.
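    Poincaré’s "small differences in the initial conditions" can be demonstrated in a few lines with the chaotic logistic map (an illustrative sketch of the general phenomenon, not anything from Poincaré himself):

```python
def logistic(x, steps):
    """Iterate the chaotic logistic map x -> 4x(1-x) `steps` times."""
    for _ in range(steps):
        x = 4.0 * x * (1.0 - x)
    return x

# Two initial conditions differing by one part in a billion.
a, b = 0.3, 0.3 + 1e-9
for steps in (10, 30, 50):
    print(steps, abs(logistic(a, steps) - logistic(b, steps)))
# The gap grows roughly exponentially with each iteration until it is
# as large as the attractor itself; past that horizon, knowing the
# initial situation "approximately" predicts nothing.
```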

  5. democratic core Says:

    To zp –
    Not trying to be contentious, basically we are in accord – it’s always a cost-benefit analysis.  Also, you are very right that this is not a partisan issue.  For example, Krugman has been writing juvenile nonsense like today’s column, http://www.nytimes.com/2009/04/10/opinion/10krugman.html?_r=1 in which he argues that we need to go back to a simpler financial world, what I call "Ozzie and Harriet" banking.  Again, the desire for simplicity is seductive.  But the universe is complex, and you can’t retreat from that.  Today’s world in which you have a billion new capitalists and an expanding global middle class is a lot more complex than the world of Krugman’s childhood, and it is naive, and dangerous, to argue that the problems in this world can be solved by a return to the perceived simplicity of the past. 
    On another note, what do you think of Stephen Wolfram and the notion that randomness may simply be a manifestation of the limitations of the human brain to discern patterns?

  6. Lexington Green Says:

    "complexity, I think the distinction can be emergent — intrinsic, bottom-up — vis directed — extraneous, and top-down.  The former is what we get from the universe, the latter what we deserve when we jack around with her (too much)."
    Moon is right.
    The former is what we get from markets, when they are working right.  It is also what we get from common law adjudication, which is case-specific and inductive and incremental. 
    Complexity per se is not a problem.   The problem is one of rigid, artificial complexity versus flexible, organically-derived complexity. 

  7. zen Says:

    Hi dc,
    It would be nice to get Shane Deichman or Yaneer Bar-Yam to weigh in here, because they would explain it better and with much greater precision than I can. I’ll send Shane an email.
    To answer your second question first, the human brain definitely has finite limits in terms of its ability to accurately discern patterns, though I would argue that more often we project patterns where they do not exist (conspiracy theorists being one example). That is not a phenomenon exclusive to the existence of randomness, though, and quantum mechanics as a field would militate against any mechanistic arguments for determinism.
    I will have to read Krugman, but I think there’s a difference between the complexity created by an increase of orders of magnitude in the multiplicity of players (that’s actually healthy in that it reduces market distortions by 800 lb gorillas – it shrinks them) and the kind of complexity created by advanced forms of derivatives that very few people actually understand.
    Lex – Common law is a good example of bottom-up complexity. I recall reading Lord Chief Justice Coke’s defense of common law findings against arbitrary, top-down, judicial intervention by the King. It’s still a good argument, though it was not a great career move for Coke at the time. 🙂

  8. Robert Paterson Says:

    The Dark Ages after the fall of Rome were surely a return to the simple – in particular, a global food system with "super tankers" from North Africa and a global linear defense system with a huge bureaucratic cost had collapsed under their own weight.

    The really smart grid would be a local energy system that supplied up to 80% of the local need.  We rely on a JIT food delivery system that offers a city only 3 days of food.  A smart food system would do the same.

    It’s these mega systems that carry the complexity.

  9. Lexington Green Says:

    "…not a great career move for Coke at the time."
    Lord Coke would have gotten, and deserved, the Big Pair of Stones Award for standing up to King James.   He was a man of principle.  Such men do happen from time to time.
    Also, as I recall, the King "kicked him upstairs" to Parliament, hoping it would shut him up, and it did not. 
    More to the point, the English retained a decentralized, empirical, inductive process of adjudication inherited from Medieval practice, whereas on the Continent they put in place a "modern", code-based, top-down, centralized system derived from Roman Law.  We Americans inherited and retained, and in fact refined and improved in many ways, the English model.  No small part of our success as a nation has been retaining these "old ways" in the face of "new and improved" approaches, which are actually regressive.  The "Medieval" approach is futuristic.  The "Modern" method is derived from early modern Absolutism. 
    As Richard Epstein put it, you need "Simple Rules for a Complex World."  Make clear, simple rules, openly and fairly imposed, and a lavish, robust and unpredictable complexity will emerge, like vines and fruit on a trellis.

  10. deichmans Says:


    Thx for the invite. I’ll reiterate what I said in my review of Taleb’s _Black Swan_ last year: complexity is not the problem, it’s how we *think* we grasp it. “Hubris” indeed….

    Consider the intrinsic variety (V) of a given system. If that system has a given number of binary inputs (e.g., 8), then V = 2^2^8, or 2^256, which is roughly 1E77 varieties. If you were to break that system into two (each with 4 inputs), the intrinsic variety of this “reductionist” view is no longer 1E77, but only 100,000 (1E5): 2^2^4 + 2^2^4. No net change in input variables, nor in outputs – yet we have filtered out more than 70 orders of magnitude.
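    The arithmetic is easy to verify with Python’s arbitrary-precision integers (a quick sketch of the calculation above; variable names are my own):

```python
from math import log10

inputs = 8
# A system with n binary inputs can realise 2**(2**n) distinct
# input-output mappings (Boolean functions) -- its intrinsic variety V.
whole = 2 ** (2 ** inputs)                 # one 8-input system: 2^256
halves = 2 * 2 ** (2 ** (inputs // 2))     # two 4-input systems: 2 * 2^16

print(round(log10(whole)))                 # prints 77: roughly 1E77
print(halves)                              # prints 131072: roughly 1E5
print(round(log10(whole)) - round(log10(halves)))   # prints 72 orders lost
```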

    This is the fallacy of reductionist approaches to complexity (e.g., Effects Based Operations): by breaking the problem into pieces, we LOSE the intrinsic variety forever. But we don’t KNOW that we have: we are convinced that we have adequately identified all feasible options, and happily (and ignorantly) proceed.

    It’s much harder to think in terms of systems vice components, but that is where the richness of variety – and truth – is at. Our approach to complexity (seemingly reducing problems by breaking them into parts) is only a thin veil we throw over the problem to deceive ourselves into thinking we know what we’re doing.

    So I don’t believe we are willingly “adding” complexity: Rube-Goldberg notwithstanding, our greatest crime is breaking apart our system-level perspectives and believing that we are getting closer to the truth.

    The reductionist model has its roots in ancient Greek philosophy, but it really took hold with the determinism of Newton (and Poincaré, a staunch Determinist). And it results in suboptimal decision making – at the expense of all nodes linked to the system.

    Rather than striving for “simplicity”, we should rather be seeking to understand MORE of the interrelationships within our complex world. That is the essence of Clausewitz’s “genius”: disregarding the minutiae in order to focus on the strategic objective.

  11. zen Says:

    Ah, Shane, you ‘da man! Thank you for clarifying.
    "This is the fallacy of reductionist approaches to complexity (e.g., Effects Based Operations): by breaking the problem into pieces, we LOSE the intrinsic variety forever. But we don’t KNOW that we have: we are convinced that we have adequately identified all feasible options, and happily (and ignorantly) proceed."
    This "reductionism", of course, is how modern states approach complex systems – as you put it, "our greatest crime is breaking apart our system-level perspectives". I have considered that behavior, which appears as regulatory micromanagement or piecemeal legislative intervention, as "adding complexity" to the system. Perhaps that was not the best way to have articulated the problem in regard to complexity. I will have to re-think some of this.
    Clausewitz, I note, was highly critical of those who tried to impose algebraic systems or recipes (i.e. Jomini) on war to regulate or prescribe a commander’s conduct. I’ve just received a copy of Antoine Bousquet’s The Scientific Way of Warfare and he dives right into the change of scientific paradigms and the interpretation of Clausewitz.

  12. Larry Dunbar Says:

    "Consider the intrinsic variety (V) of a given system. If that system has a given number of binary inputs (e.g., 8), then V = 2^2^8, or 2^256, which is roughly 1E77 varieties."

    So you are saying we have a 2 x 2 matrix and are moving through eight different areas of space at the same time: implicit and explicit rule-sets, strongly and weakly enforced. Cut those in half and we think we have reached simplicity, but you have only reduced by 1/4?

  13. deichmans Says:


    For the selected # of inputs (in my example, 8), it is really an 8 x 8 x 2 matrix of potential system states.

    Cut the input variables by half, and your subsequent variety *exponent* goes down quadratically.

  14. Larry Dunbar Says:

    "it is really an 8 x 8 x 2 matrix of potential system states."

    That seems simple enough, so the complexity is not in the math, but what the math represents: zero velocity at maximum acceleration as defined. 

    A potential has two possibilities; it exists or it doesn’t exist, so there is a force between these two potentials, but no velocity. Maybe it is accelerating into itself and creating more nothing, quadratically. Now that sounds complex.

  15. Larry Dunbar Says:

    It would be interesting to find out if your friend Shannon Love does have a brain (as in the song title), or if the technology of the grid is becoming emergent and we, the uninformed, have no vision of it. If it is becoming emergent, then perhaps some complexity does need to be added to handle the emerging system. Emerging technology is simplifying in that it is moving forward, or backward. To stay in one place and become irrelevant to the rest of the world creates complexity.

    But what are we talking about here? This complexity GE wants to add is probably nothing more than a computer and a communication system, which, I have to admit, in the last century was very complex. While the reliability can be called into question, it is the same question that needs to be asked of the internet itself, because the internet, or a network like it, is really the “Smart” in the Smart Grid. Can the internet, and hence the Smart Grid, remain resilient enough to be counted on by the people of the USA and the world? I have my doubts, but they are the same doubts that need answering for both the Smart Grid and the internet, and one way or another they need to be answered.

    The emergent technology GE is talking about is the electrical current of Thomas Edison replacing that of Westinghouse and Tesla. Edison was all about Direct Current (DC); Tesla developed Alternating Current (AC). DC is harder to control than AC, which is largely self-controlling, so a DC system needs to be “Smarter” than the previous AC system.

    It might work, if your Grid had a brain.

  16. Lexington Green Says:

    Confirming:  Shannon has a brain. 
    Speculating:  A better one than Mr. Dunbar’s.
    Suggesting:  Go and tell him he doesn’t have a brain on one of his own posts, and say why you think so, so we can watch him mop the floor with you. 

  17. Larry Dunbar Says:

    "Suggesting:  Go and tell him he doesn’t have a brain on one of his own posts, and say why you think so, so we can watch him mop the floor with you. "


  18. Interessantes woanders (2009.04.11) › Immersion I/O Says:

    […] Excess Complexity is the Route to Extinction […]

  19. Lexington Green Says:

    Larry,  seriously, go talk to Shannon.  He responds to comments.

  20. zenpundit.com » Blog Archive » Excess Complexity is the Route to … Says:

    […] rest is here: zenpundit.com » Blog Archive » Excess Complexity is the Route to … […]

  21. Joe Says:

    The problem with our systems is the idea of the Grid itself, completely prone to terrorist attack and dependent entirely on a top-down situation. Rather than powered boxes on every house pulling energy from the sun, using technology that has been tied up and gagged by patent law, we have this sort of centralized disaster. Some have actually taken the initiative and built underground nuclear generators for entire towns, encased in something to prevent tampering and, in effect, give power for decades, but I don’t see this from Obama; it just sounds like more "we know best", ignoring the locked-up advanced technology by pretending it doesn’t exist. I’m sick of these liars. They are currently ripping off people and everyone just takes it; if they can rob you financially, why would they give you technology that liberates you from a centralized payment system? Banks function on interest, and that means debt, so naturally they need debtors, and they also need payers to electric companies, etc. The functional aspects don’t matter, just the payment systems.

  22. (old found draft post)CyberWar – Ref Links « PurpleSlog – Awesomeness & Modesty Meets Sexy Says:

    […] https://zenpundit.com/?p=3074 http://www.schneier.com/blog/archives/2007/06/cyberwar.html […]
