Economists wield one of the most influential theories in modern society. If we get it right, the economy can be tuned to reward people fairly, offering a firm foundation on which to build a pleasant and prosperous society. If we get it wrong, economic rewards become detached from effort and service. In that case, undeserving people can bask in the sunshine of wealth while hard-working people are condemned to the cold shadows of poverty. Over time, such outcomes can rot the foundations of society, creating a dangerous situation for rich and poor alike. Economic theory matters.
This book invites you on a journey into one of the least explored regions of economics: trade in a complex system. The assumption underlying all this work is that the economy is a complex system. The emphasis of this book is on building and expanding our understanding, not on criticising the very large body of linear-system theory that underpins academic economics today.
A great deal has been learned from linear-system analyses, the primary tool used by economists for over a century. The century-long use of equation-based research has shaped the economic canon and the nature of its research questions. What is possible to explore using mathematical equations has largely been explored. What is not has largely been ignored. For example, there are no linear-system models exploring the macro outcome of changing the distance at which agents detect sellers and buyers in a system. This is not because economists are unintelligent, but because equation-based models do not lend themselves to this kind of analysis.
If your tool is a hammer, what you look for are nails. The more intelligent you are, the better you are at finding nails. There is nothing intrinsically wrong with nails or hammers. However, if all you have known before is the linear-system economic theory taught to undergraduate university students, it is possible that you will be shocked by the behaviour of trade in a complex system.
In order to examine complex systems, a relevant tool must be used. This work uses agent-based models (computational economics) because it is one of the few tools that can directly investigate emergent behaviour and evolution in trading environments. All the models shown here are built in NetLogo (Wilensky, 1999), an academic multi-agent programming environment designed to teach agent-based modelling to university students. The models used in this book can be downloaded and run on most home computers. You can test the experimental conclusions for yourself or run experiments for your own purposes.
Some readers experienced with multi-agent models might wonder: why use NetLogo? Isn’t NetLogo for amateurs who have yet to learn how to program in a proper language? There are several reasons why NetLogo is a sound modelling choice for this work.
First, much of this work seeks to understand the core properties of trade. This is where things are simplest and, accordingly, much of the experimentation presented in this book uses a simple model. We do not require supercomputers, C++ or Python, or multithreading. On the other hand, the last NetLogo model used in this work includes ‘living’ agents—populations of mortal agents that give birth and nurture children, have evolving propensities, can starve to death, and so on. Starvation indicates a failure of distribution, a core economic competency. It is not possible to explore the economics of starvation, for example, in models where life is assumed. If we are already exploring economics beyond most existing models, why would we need an even more capable tool? Finally, NetLogo is one of the most accessible agent-based modelling languages in existence, with a large online support base. On the whole, NetLogo ticks a lot of boxes.
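To give a concrete flavour of what such a model involves, the sketch below implements a toy population of mortal trading agents in Python (rather than NetLogo, for readers who prefer a general-purpose language). Agents gain energy from randomly paired trades, pay a metabolic cost each tick, starve when their energy reaches zero, and reproduce above a birth threshold. Every name and parameter value here is an illustrative assumption, not a reproduction of the book’s actual models.

```python
import random

class Agent:
    """A mortal trading agent: trades for energy, pays a metabolic
    cost each tick, starves at zero energy, reproduces above a
    threshold. All parameters are illustrative assumptions."""
    def __init__(self, energy=10.0):
        self.energy = energy

def step(agents, rng, trade_gain=1.5, metabolism=1.0, birth_threshold=20.0):
    """Advance the population one tick and return the survivors."""
    rng.shuffle(agents)
    # Pair agents at random; each trade yields energy to both parties.
    # An unpaired agent (odd population) simply misses a trade this tick.
    for a, b in zip(agents[::2], agents[1::2]):
        a.energy += trade_gain
        b.energy += trade_gain
    next_gen = []
    for a in agents:
        a.energy -= metabolism          # metabolic cost of living
        if a.energy <= 0:
            continue                    # starvation: a distribution failure
        if a.energy >= birth_threshold:
            a.energy /= 2               # parent shares energy with child
            next_gen.append(Agent(a.energy))
        next_gen.append(a)
    return next_gen

if __name__ == "__main__":
    rng = random.Random(42)
    population = [Agent() for _ in range(50)]
    for _ in range(100):
        population = step(population, rng)
    print("surviving agents:", len(population))
```

Even this toy version exhibits the property the passage above stresses: life is not assumed. Whether anyone survives depends on whether trade distributes enough energy, so starvation is a possible model outcome rather than something ruled out by construction.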
This book is about economic theory and will not teach you how to program agent-based models. If you are looking for an instructional book on agent-based modelling in economics, there are already good books on the market (e.g. Hamill & Gilbert, 2016; Miller & Page, 2009; Railsback & Grimm, 2011; Wilensky & Rand, 2015).
This book is about moving forward. Where possible, it avoids repeating past criticisms of linear-system economics. There already exists a comprehensive literature challenging the validity of economic theory as it is taught in universities. In fact, at the time of writing, there is a potential revolution brewing in economics.
Economics is a discipline dogged by criticism from within and without. University students have organised themselves to demand that the undergraduate curriculum be changed (Rethinking Economics) and have recently published what they hope will become a new pluralistic textbook (Fischer et al., 2017). The UK’s main academic economic research funding source, the Economic and Social Research Council (ESRC), has funded multi-million-pound programmes called ‘Rebuilding Macroeconomics’ and ‘Rebuilding Microeconomics’. There are established critical books such as Doughnut Economics (Raworth, 2017), Debunking Economics (Keen, 2011), and The Origin of Wealth (Beinhocker, 2006). A top mainstream economist, Paul Romer, recently published a scathing indictment of his profession called ‘The Problem with Macroeconomics’ (Romer, 2016). The Institute for New Economic Thinking (INET) has an article on its website called Publishing and Promotion in Economics: The Tyranny of the Top Five (Skidelsky et al., n.d.), exploring how and why the top five ranked economics journals promote only a narrow band of research, while economists publishing in the top five have the best chances of being hired, funded, and promoted. As early as 1977, physicists were so fed up with the economic theory taught in universities that they started building their own, possibly beginning with Osborne (1977). As a group, they are now called econophysicists. In 1983, a mathematician described ‘How Economists Misuse Mathematics’ (Blatt, 1983). Meanwhile, in the real world where economic theory influences millions of lives, policy largely based on linear-system economic models led the population first to an asset-bleeding financial crash, and then to ‘austerity’, a policy that has caused people in the UK to die from lack of money (Watkins et al., 2017).
Some millionaires, such as Nick Hanauer, see today’s high wealth inequality as a direct result of the implementation of linear-system economic policies.
This is not an exhaustive list of the disquiet surrounding the economics profession. However, the general atmosphere needs to be noted as the environment in which this book was written.
As Kuhn (2012) pointed out, new knowledge can trigger a crisis in scientific disciplines, especially where it touches on core beliefs. For example, at the beginning of the twentieth century, quantum mechanics shocked classical physics with traits that were completely counterintuitive and challenged physicists’ beliefs about how the universe worked. Many classical physicists resisted, perhaps the most notable being Max Planck himself who, after making one of the key breakthroughs of quantum physics (the quantum), spent years trying to explain his own discovery in a less revolutionary way (McEvoy & Zarate, 2014, p. 43). Eventually, all physicists accepted quantum physics and its implications. As a result, today, I can type on a computer built using knowledge developed from quantum physics.
This crisis of new knowledge was repeated halfway through the twentieth century when complexity reared its unwelcome head. In the 1950s, both Alan Turing and Boris Belousov made significant complexity discoveries. While Turing’s work was cut short when he was arrested for homosexuality and died soon afterwards, Belousov’s repeatable experiment was dismissed as impossible. Belousov was so disheartened that he walked away from science altogether.
What was being challenged was a deeply embedded Newtonian philosophy built on linear-system theory and models. Many established scientists resisted accepting the new knowledge as it challenged their core beliefs about how the entire universe worked. Eventually, the evidence became so overwhelming that those continuing to resist risked becoming irrelevant.
Complexity was finally accepted into the bosom of science some years after Edward Lorenz’s (1963) now famous paper concerning the ‘Butterfly Effect’. Today, Belousov’s once-dismissed complexity experiment forms the basis of an entire area of chemical study called the BZ reaction, while Turing is acknowledged as one of the first mathematicians to grasp the relevance of emergent behaviour in biology.
It has been over half a century since complexity revolutionised the thinking in the physical sciences. In 2019, the economics discipline still trains s...