
Complexity Economics: A New Way to Witness Capitalism


Abstract

Having detailed the current limitations of capitalism and the possible solution pathways, the final chapter of the book offers the reader an introduction to complexity economics. As we move into a cashless era, it is essential that we develop new economic tools that are capable of leveraging these technologies whilst measuring their impact. Without a means of measurement, it would not be possible to come up with a new theory of capitalism or, as the author states, a buffet of theories. What is required is not a universal law of capitalism but a nuanced selection of theories based on different scenarios. As standard tools are ill-suited to the task of constructing multiple scenarios based on the actions of multiple market players, the author makes the case that the solution lies in borrowing from the lessons of complexity science and the emerging discipline of complexity economics. By providing an accessible introduction to the subject, the reader is informed about the new tools and frameworks that need to be constructed to gauge and govern this system. Thus, the reader will be introduced to the principles and models of Econophysics and Complexity Economics, which offer greater mathematical exactitude and a higher probability of identifying systemic risk than current economic models.


Notes

1. The Royal Society is a Fellowship of many of the world’s most eminent scientists and is the oldest scientific academy in continuous existence.

2. See ‘Technological novelty profile and invention’s future impact’, Kim et al. (2016), EPJ Data Science.

3. The term ‘combinatorial evolution’ was coined by the scientific theorist W. Brian Arthur, who is also one of the founders of complexity economics. In a vein similar to Thomas Kuhn’s ‘The Structure of Scientific Revolutions’, Arthur’s book, ‘The Nature of Technology: What It Is and How It Evolves’, explains that technologies are based on interactions and composed into modular systems of components that can grow. Being modular, they combine with each other, and when a technology reaches a critical mass of components and interfaces, it evolves to enter new domains and changes based on the new natural phenomena it interacts with. In sum, Arthur’s combinatorial evolution encompasses the concepts of invention, biological evolution, behavioural models, social sciences, technological change, innovation and sociology.

4. Even evolution is not free from the combinatorial approach. Charles Darwin, best known for the theory of evolution, built his classification system on the work of Carl Linnaeus (1707-1778), the father of taxonomy.

5. The Differential Analyser consisted of multiple rotating disks and cylinders, driven by electric motors and linked together with metal rods, that were manually set up (sometimes taking up to two days) to solve a given differential equation problem.

6. In economics, Kondratiev waves (named after the Soviet economist Nikolai Kondratiev) are cyclical phenomena that link the cycle of a technology’s invention, expansion and ultimate replacement to its economic effects. Although Kondratiev was the first to study the economic effects of technology on prices, wages, interest rates, industrial production and consumption in 1925, Joseph Schumpeter was responsible for their entry into academia.

7. In this paper, the model is driven by technological change that arises from intentional investment decisions made by profit-maximizing agents.

8. See “A Failed Philosopher Tries Again.”

9. (i) LatAm sovereign debt crisis - 1982, (ii) Savings and loan crisis - 1980s, (iii) Stock market crash - 1987, (iv) Junk bond crash - 1989, (v) Tequila crisis - 1994, (vi) Asia crisis - 1997 to 1998, (vii) Dotcom bubble - 1999 to 2000, (viii) Global financial crisis - 2007 to 2008.

10. LHC: The Large Hadron Collider is the world’s largest and most powerful particle accelerator, located at CERN, the European Organization for Nuclear Research (Conseil Européen pour la Recherche Nucléaire). The LHC is a 27-kilometre ring of superconducting magnets that accelerates particles such as protons to nearly the speed of light before colliding them to study the quantum particles inside the protons. On the 4th of July 2012, the ATLAS and CMS experiments at CERN’s Large Hadron Collider discovered the Higgs boson, the elementary particle that explains why particles have mass. It was, and will remain, one of the most important scientific discoveries of the century.

11. Some of the early trailblazers who combined the study of complexity theory with economics include Kenneth Arrow (economist), Philip Anderson (physicist), Larry Summers (economist), John Holland (computer scientist), Tom Sargent (economist), Stuart Kauffman (theoretical biologist), David Pines (physicist), José Scheinkman (economist), William Brock (economist) and, of course, W. B. Arthur (economist), who coined the term complexity economics and has been largely responsible for its initial growth and exposure to mainstream academia.

12. Knightian uncertainty is an economic term that refers to risk that is immeasurable and impossible to calculate, as distinct from quantifiable risk. “Uncertainty must be taken in a sense radically distinct from the familiar notion of Risk, from which it has never been properly separated”, Frank Knight, economist from the University of Chicago.

13. The European Central Bank (ECB) has developed a DSGE model, called the Smets-Wouters model, which it uses to analyse the economy of the Eurozone as a whole. (See Smets and Wouters, ‘An estimated dynamic stochastic general equilibrium model of the euro area’, Journal of the European Economic Association, Volume 1, Issue 5, September 2003, Pages 1123-1175.)

14. These constraints include budget constraints, labor demand constraints, wage constraints (a Calvo constraint on the frequency of wage adjustment), capital constraints, etc. (Slanicay, 2014).

15. The Taylor rule is a set of guidelines for how central banks should alter interest rates in response to changes in economic conditions. The rule, introduced by economist John Taylor, was established to adjust and set prudent rates for the short-term stabilization of the economy, while still maintaining long-term growth. The rule is based on three factors: (i) targeted versus actual inflation levels; (ii) full employment versus actual employment levels; (iii) the short-term interest rate appropriately consistent with full employment (Investopedia). Its mathematical interpretation is: r = p + 0.5y + 0.5(p - 2) + 2, where r = the federal funds rate, p = the rate of inflation, and y = the percent deviation of real GDP from a target (Bernanke, 2015).
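
To make the formula concrete, here is a minimal sketch of the rule in Python, exactly as stated above; the scenario figures fed into it are hypothetical, chosen only for illustration.

    def taylor_rule(inflation, output_gap, inflation_target=2.0, neutral_rate=2.0):
        # Taylor rule: r = p + 0.5*y + 0.5*(p - target) + neutral rate,
        # with all quantities in percent (Bernanke, 2015).
        return inflation + 0.5 * output_gap + 0.5 * (inflation - inflation_target) + neutral_rate

    # Hypothetical scenario: 3% inflation, output 1% above potential.
    # r = 3 + 0.5*1 + 0.5*(3 - 2) + 2 = 6.0
    print(taylor_rule(inflation=3.0, output_gap=1.0))  # -> 6.0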

16. Contract theory was first developed in the late 1960s by Kenneth Arrow (winner of the 1972 Nobel prize in economics), Oliver Hart and Bengt R. Holmström. The latter two shared the Nobel prize in economics in 2016.

17. https://www.federalreserve.gov/econresdata/frbus/us-models-about.htm

18. As per Turner, monetary finance is defined as a fiscal deficit which is not financed by the issue of interest-bearing debt, but by an increase in the monetary base, i.e. of the irredeemable, non-interest-bearing fiat monetary liabilities of the government/central bank, e.g. helicopter money.

19. Daniel Kahneman is known for his work on the psychology of judgment and decision-making, as well as behavioural economics, for which he was awarded the 2002 Nobel Memorial Prize in Economic Sciences.

20. Case in point: US investment in developing a better theoretical understanding of the economy is very small, around $50 million in annual funding from the National Science Foundation, or just 0.0005 percent of a $10 trillion crisis (Axtell and Farmer, 2015).

21. Tait (1831-1901) was a Scottish mathematical physicist, best known for knot theory, topology, graph theory and Tait’s conjecture.

22. William Stanley Jevons, Léon Walras and Carl Menger simultaneously built and advanced the marginal revolution while working in complete independence of one another. Each scholar developed the theory of marginal utility to understand and explain consumer behaviour.

23. The MONIAC (Monetary National Income Analogue Computer) was a hydraulic simulator that used coloured water to show the flow of cash.

24. Hidalgo is a statistical physicist, writer, and associate professor of media arts and sciences at MIT. He is also the director of the Macro Connections group at the MIT Media Lab and one of the creators of the Observatory of Economic Complexity - http://atlas.media.mit.edu/en/

25. Econophysics is an interdisciplinary research field that applies theories and methods originally developed by physicists in order to solve problems in economics. Refer to Table 2 for sources of Econophysics textbooks.

26. The First Welfare Theorem: every Walrasian equilibrium allocation is Pareto efficient. The Second Welfare Theorem: every Pareto efficient allocation can be supported as a Walrasian equilibrium. These are the fundamental theorems of welfare economics. The first states that any competitive (Walrasian) equilibrium leads to a Pareto efficient allocation of resources; the second states the converse, that any Pareto efficient allocation can be sustained as a competitive equilibrium.
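
In conventional textbook notation (a standard formalisation, not the author’s own rendering), the two theorems can be stated as:

    \textbf{First Welfare Theorem.} If $(x^{*}, p^{*})$ is a Walrasian equilibrium
    of an economy with locally non-satiated preferences, then $x^{*}$ is Pareto
    efficient: there is no feasible allocation $x'$ with $u_i(x'_i) \geq u_i(x^{*}_i)$
    for every agent $i$ and $u_j(x'_j) > u_j(x^{*}_j)$ for some agent $j$.

    \textbf{Second Welfare Theorem.} If, in addition, preferences are convex and
    continuous, then any Pareto efficient allocation $\hat{x}$ can be supported as
    a Walrasian equilibrium $(\hat{x}, \hat{p})$ after a suitable lump-sum
    redistribution of endowments.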

27. Alan Kirman is professor emeritus of Economics at the University of Aix-Marseille III and at the Ecole des Hautes Etudes en Sciences Sociales. He is a heterodox economist and has published numerous papers and books on Complexity Economics, Game Theory and Non-Linear Dynamics, among other subjects. His latest book, ‘Complexity and Evolution: Toward a New Synthesis for Economics’, was published in August 2016.

28. Some other known languages and tools used are Repast and SimSesam. Both platforms are more advanced than NetLogo but require some previous coding experience in Java. NetLogo is a dialect of the Logo language and is a general-purpose framework. Repast and SimSesam allow for easier integration of external libraries and higher levels of statistical analysis, data visualisation and geographic information systems.
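
For readers new to agent-based modelling, the fragment below is a minimal, framework-free sketch in Python of the kind of simulation these platforms are built for; the random-exchange market and every name in it are illustrative inventions, not NetLogo or Repast code.

    import random

    class Agent:
        # A trader holding some wealth; the population interacts pairwise.
        def __init__(self, wealth=100.0):
            self.wealth = wealth

    def step(agents):
        # One tick: each agent hands a random fraction of its wealth to a random peer.
        for a in agents:
            b = random.choice(agents)
            if b is a or a.wealth <= 0:
                continue
            transfer = random.uniform(0, 0.1) * a.wealth
            a.wealth -= transfer
            b.wealth += transfer

    agents = [Agent() for _ in range(1000)]
    for _ in range(500):
        step(agents)

    # Aggregate patterns (here, wealth concentration) emerge from purely
    # local interactions - the hallmark of agent-based models.
    top_decile = sorted(a.wealth for a in agents)[-100:]
    print("wealth share of top 10%:", sum(top_decile) / sum(a.wealth for a in agents))

Platforms such as NetLogo wrap exactly this loop - agents, a scheduler, aggregate reporters - in a higher-level language with built-in visualisation.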

29. Prof. Doyne Farmer is a professor of mathematics at the Oxford Martin School. He is also Co-Director of Complexity Economics at the Institute for New Economic Thinking and an External Professor at the Santa Fe Institute. His current research is on complexity economics, focusing on systemic risk in financial markets and technological progress. During his career, he has made important contributions to complex systems (see Appendix 1), chaos theory, artificial life, theoretical biology, time series forecasting and Econophysics. He is also an entrepreneur and co-founded the Prediction Company, one of the first companies to do fully automated quantitative trading.

30. Jacky Mallett has a PhD in computer science from MIT. She is a research scientist at Reykjavik University who works on the design and analysis of high-performance distributed computing systems and on simulations of economic systems, with a focus on the Basel regulatory framework for banks and its macro-economic implications. She is also the creator of ‘Threadneedle’, an experimental tool for simulating fractional reserve banking systems.

31. Constant Proportion Portfolio Insurance (CPPI) is a method of portfolio insurance in which the investor sets a floor on the value of the portfolio and then structures asset allocation around that decision. Two asset classes are used: a risky asset (usually equities or mutual funds) and a riskless asset (either cash or Treasury bonds). The percentage allocated to each depends on how aggressive the investment strategy is.
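
As a worked illustration, here is a minimal sketch of the standard CPPI allocation rule in Python (cushion = portfolio value minus floor; risky exposure = multiplier times cushion); the multiplier and portfolio figures are hypothetical.

    def cppi_allocation(portfolio_value, floor, multiplier=3.0):
        # CPPI rule: the cushion is the distance to the floor; the risky-asset
        # allocation is a fixed multiple of it, capped here to rule out leverage.
        cushion = max(portfolio_value - floor, 0.0)
        risky = min(multiplier * cushion, portfolio_value)
        riskless = portfolio_value - risky  # cash or Treasury bonds
        return risky, riskless

    # Hypothetical portfolio: value 100, floor 80, multiplier 3.
    # cushion = 20, so 60 goes to the risky asset and 40 to the riskless one.
    print(cppi_allocation(100.0, 80.0))  # -> (60.0, 40.0)

A higher multiplier makes the strategy more aggressive: exposure to the risky asset grows faster as the portfolio pulls away from the floor, and is cut faster as it approaches it.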


Copyright information

© 2017 Kariappa Bheemaiah


Cite this chapter

Bheemaiah, K. (2017). Complexity Economics: A New Way to Witness Capitalism. In: The Blockchain Alternative. Apress, Berkeley, CA. https://doi.org/10.1007/978-1-4842-2674-2_4
