The science of complexity gives us a sharper lens for analyzing the world around us. There are different types of complexity, each with its own proponents. Algorithmic complexity is concerned with how difficult it is to describe a model. Deterministic complexity deals with the unpredictability that arises in dynamic systems. Aggregate complexity studies how the individual components of a system interact.
Here it is important to remember that a complex system is not the same as a complicated model. Complex systems are characterized by self-organization, emergence, non-linearity, feedback, and path dependence, whereas a complicated model may simply be one that tries to emulate real-world behavior in great detail.
According to the Merriam-Webster dictionary, complexity is the state or quality of being difficult to understand. Social complexity is specific to the social sciences: it concerns the behavioral analysis of human society, and it has only grown over time. Herbert A. Simon, John H. Miller, Scott E. Page, Nigel Gilbert, and Klaus G. Troitzsch seem to see eye to eye on these points.
However, there is little else they agree on, apart from the proposition that complexity theory should take center stage in social analysis. Each has his own definition of complexity and his own methodology for studying it.

Simon, one of the founding fathers of Artificial Intelligence and a Nobel laureate, takes a philosophical view of complexity. His central theme is hierarchy: complex systems share common attributes because they are organized into hierarchies, and hierarchical models are what let us evaluate just how difficult a system is to understand. He arrives at this conclusion by the historical route, tracing hierarchy to the holism that followed World War I and to the chaos and catastrophe theories that followed World War II.
Holism is the theory of a whole whose interconnected parts necessarily depend on one another; the Gestalt principle that an organized whole is perceived as more than the sum of its parts helps in understanding this idea. Catastrophe theory is the mathematical study of how a system at steady state can abruptly switch to an unstable state in which its variables change as well. Emergence refers to the patterns that arise from local interactions between the individual entities of a system.
Thus, a system evolves as its agents react to one another and bring about changes of state. When this happens at multiple levels, the result is a hierarchical system, an architecture of complexity. Four aspects of hierarchy contribute to this architecture:
- Systems of a given complexity consist of subsystems arranged in a particular interrelated order.
- A complex system's structure takes time to evolve, and it does so at a certain rate.
- The dynamic properties of an organized system can be decomposed into those of its subsystems to better understand the system's behavior.
- Complex systems share common properties across their hierarchies that relate to how they are described.
Miller and Page see a complex system in terms of the emergence produced by its interacting agents. They espouse the idea of complexity as the push and pull of feedback loops in a system, and they call the way localized behavior stacks up into global behavior "stacked emergence", much like Simon's hierarchical architecture. The cause and implication of complexity depend on the type of complexity being observed:
- In disorganized complexity, the interactions between local agents cancel out, and the system's dynamics do not change.
- In organized complexity, interacting agents feed back into the system, and the system's dynamics are influenced by the feedback it receives from its local agents.
Whether a model's emergent properties arise in the disorganized or the organized sense determines its complexity, as the sketch below illustrates.
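To make the distinction concrete, here is a minimal Python sketch (my own illustration, not taken from any of the authors above) contrasting agents whose independent random moves cancel out in the aggregate with agents that also respond to the aggregate state through a feedback term. The agent count, step count, and feedback strength are arbitrary assumptions.

```python
import random

def disorganized(n_agents=1000, steps=50):
    """Agents act independently; their random moves tend to cancel out in the aggregate."""
    states = [0.0] * n_agents
    for _ in range(steps):
        states = [s + random.uniform(-1, 1) for s in states]
    return sum(states) / n_agents  # stays close to zero

def organized(n_agents=1000, steps=50, feedback=0.1):
    """Each agent also responds to the aggregate state: a simple feedback loop."""
    states = [random.uniform(-1, 1) for _ in range(n_agents)]
    for _ in range(steps):
        mean = sum(states) / n_agents
        states = [s + random.uniform(-1, 1) + feedback * mean for s in states]
    return sum(states) / n_agents  # feedback can push the whole system away from zero

random.seed(42)
print("disorganized aggregate:", round(disorganized(), 3))
print("organized aggregate:  ", round(organized(), 3))
```

In the disorganized case the average hovers near zero however long the run; with the feedback term switched on, small fluctuations tend to be amplified and the system as a whole can drift away from its starting point.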
Simon's hierarchical complexity is not concerned with differentiating models by how organized they are: for him, a model's emergent properties come from the interactions between agents, whereas Miller and Page hold that such interactions can be ignored unless they are part of a feedback mechanism.
While Simon considers every system in the universe hierarchical, from solar systems and atoms to social spheres, Gilbert and Troitzsch are of the opinion that human societies are different: societies are unpredictable. For Gilbert and Troitzsch, complex models are those whose system behavior cannot be subdivided in order to be understood. This stands in stark contrast to Simon's, and to Miller and Page's, contributions to the definition of complexity. Gilbert and Troitzsch point to the non-linear transmission of knowledge between individuals as the deciding factor in a model's complexity, whereas Miller and Page dismiss disorganized complexity and attribute any change in the system to its feedback. Gilbert and Troitzsch are also wary of complexity theories overpowering the actual study of social phenomena.
Gilbert and Troitzsch would rather decide on the optimal amount of real-world complexity that a model should imitate. The art of modelling lies in balancing a model's simplicity against giving it enough 'meat' to be worth implementing.
The three accounts were thought through in slightly different ways, but each has been instrumental in how the complexity of a model is assessed in computational social science.
One of history's first social simulations was the von Neumann machine, which gave the world the first cellular automaton: a self-replicating model of tiny cells on a grid that depend on their neighbours to change their state. John von Neumann and Stanislaw Ulam knew that trial-and-error experimentation on neutron behavior was too expensive and complicated, so they devised a way to simulate a Turing machine instead.

John Conway constructed another cellular automaton, the "Game of Life", which mathematically demonstrated the power of the universal Turing machine: anything that can be computed algorithmically can be computed within the rules used to construct the game. Thus a new research methodology was formed, beyond theory-building, a priori reasoning, or pattern searching.
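As a quick illustration, here is a minimal Python sketch of Conway's update rule; the set-based grid representation and the glider starting pattern are my own choices, and the snippet shows only the rule itself, not a proof of universality.

```python
from collections import Counter

def step(live_cells):
    """Advance Conway's Game of Life by one generation.
    live_cells is a set of (x, y) coordinates of live cells."""
    neighbour_counts = Counter(
        (x + dx, y + dy)
        for (x, y) in live_cells
        for dx in (-1, 0, 1) for dy in (-1, 0, 1)
        if (dx, dy) != (0, 0)
    )
    # A cell is alive next generation if it has exactly 3 live neighbours,
    # or if it is alive now and has exactly 2 live neighbours.
    return {cell for cell, n in neighbour_counts.items()
            if n == 3 or (n == 2 and cell in live_cells)}

# A glider: a pattern that reappears shifted diagonally every 4 generations.
cells = {(1, 0), (2, 1), (0, 2), (1, 2), (2, 2)}
for _ in range(4):
    cells = step(cells)
print(sorted(cells))  # the same glider, translated by (1, 1)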

Simulations are replications of the real world. As a research method, they are useful for "what-if" scenarios that are not feasible to carry out in reality. In the social sciences, simulation is the third way of evaluating, analyzing, and understanding the world around us, alongside scientific induction and deduction. The value of simulating a model of the world lies in processing a system of inputs and observing the corresponding outputs. Even when the output has already been hypothesized, simulations are useful for a variety of purposes.
Simulation is used in the following ways:
- Prediction of a system’s behavior from a set of complicated inputs.
- Performance of a system for its functional optimization.
- Training for accuracy, optimization, etc. in a safe, dynamic, and interactive environment.
- Entertainment of the senses with virtual or augmented simulation.
- Education, by conveying experiences, principles, relationships, concepts, and so on.
- Proof of concepts at any scale, limited only by imagination.
- Discovery, as a by-product of simulations built for prediction, performance, education, and so on.
Simulations can therefore be used in practically any sphere of life, from economic models to biological systems. There are, however, limiting factors less poetic than a lack of imagination. While imagination opens up the possibility of more, or better, assumptions built into a system, simulation-based study demands level-headedness. Simulation models must be simple enough to be replicated and understood easily, a principle known as the "keep it simple, stupid" (KISS) strategy or the "reductionist hypothesis". Even with such simplicity, agents interact with one another and with their surroundings in ways that let a model's complexity grow indefinitely. Another dominant form of modeling is based on human rationality, and we live in a largely unpredictable society: the unconstrained nature of human rationality invites randomness into a simulation model, giving indeterminate results.
One of the earliest large-scale simulations was the Simulmatics project, which modeled voter reactions to the issues of John F. Kennedy's presidential campaign. Voter behavior was being analyzed this way as far back as the early 1960s. In hindsight, the results were close to the actual reactions, even as Ithiel de Sola Pool and Robert Abelson acknowledged that other techniques would have been just as accurate. Given the amount of data available now, considerably more than in the 1960s, even a simple model simulation calls for a machine with high performance and processing power.
Computers perform simulations only as they are programmed to, empirically and with the given inputs alone. It can be argued that logical gaps, biases, limits in the depth of our knowledge, and so on are easier to detect when a simulation does not concur with what at first seemed the obvious answer to the problem statement the model addresses.
Incorporating a variety of tools and methodologies can make for a better and sounder theory. This means more runs of a simulation and more corroboration across different models of a given problem statement, as sketched below. The objection to computational social science that the computations are built to give predetermined results is true in effect; nonetheless, the computational tools are not to blame for it. Efficient models are not black boxes: in fact, they incorporate factors such as emergence.
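As an illustration of what "more runs" means in practice, the sketch below runs a purely hypothetical adoption model under fifty different random seeds and reports the spread of outcomes rather than a single answer. The model, its parameters, and the seed count are illustrative assumptions, not a method taken from the references.

```python
import random
import statistics

def toy_adoption_model(seed, n_agents=200, steps=30, base_rate=0.005):
    """A hypothetical diffusion model: each step, every non-adopter adopts with a
    small probability that grows with the current share of adopters."""
    rng = random.Random(seed)
    adopters = 1
    for _ in range(steps):
        share = adopters / n_agents
        # Chance events early in a run can change the whole trajectory.
        for _ in range(n_agents - adopters):
            if rng.random() < base_rate + 0.1 * share:
                adopters += 1
    return adopters / n_agents

# Run the same model under many seeds; report the distribution, not a single run.
results = [toy_adoption_model(seed) for seed in range(50)]
print("mean final adoption:", round(statistics.mean(results), 3),
      "| std dev across runs:", round(statistics.stdev(results), 3))
```

Conclusions are then drawn from the distribution of outcomes, which is also where corroboration with a differently built model of the same problem becomes possible.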
Moreover, the objection to the artificial world of a simulation concerns the utility of a model, not its validity. The simulated nature of seemingly abstract weather prediction models does not invalidate meteorology. A model is only what it was constructed to be, but that does not reduce its authenticity; a well-constructed model will have undergone several verification and validation steps beforehand.
All this to say, abstract concepts are better understood when actually experienced, even as a simulation, however incomplete it might seem for the sake of simplicity.
System Dynamics models synthesize the real world into a network of stocks and flows between a source and a sink. They have defined boundaries and form a single monolithic model, with a mathematical strategy for modeling dynamic social processes. This is readily apparent in Weidlich's oversimplified System Dynamics model of political opinion formation, in which the elements are self-contained: they both react to and create the political climate within the model. The system is treated as a whole, without individuals in the analytical paradigm.
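Here is a minimal stock-and-flow sketch in Python, loosely in the spirit of an opinion-formation model but not Weidlich's actual equations: two stocks of opinion holders, with flows between them driven by the aggregate "climate". All rates, initial shares, and the time step are invented for illustration.

```python
def simulate_opinion(steps=100, dt=0.1, persuasion=0.3, defection=0.1):
    """A minimal stock-and-flow sketch (not Weidlich's actual equations):
    two stocks of opinion holders with flows between them driven by the
    aggregate 'climate', here simply the share holding opinion A."""
    stock_a, stock_b = 0.4, 0.6                        # stocks: population shares
    history = []
    for _ in range(steps):
        climate = stock_a                              # feedback from the system as a whole
        flow_b_to_a = persuasion * climate * stock_b   # inflow to stock A
        flow_a_to_b = defection * (1 - climate) * stock_a  # outflow from stock A
        net = dt * (flow_b_to_a - flow_a_to_b)
        stock_a += net
        stock_b -= net
        history.append(stock_a)
    return history

trajectory = simulate_opinion()
print("final share holding opinion A:", round(trajectory[-1], 3))
```

Note that there are no individuals anywhere in this code, only aggregate quantities and the flows between them, and the same inputs always produce the same deterministic trajectory.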
Micro-analytical simulation, on the other hand, works at two main levels, the aggregate and the individual, and both are heavily data-oriented in their predictions. Data collection forms the basis of "truth" for a micro-simulation model. In a demographic model of a population, the structure changes endogenously and needs to be re-weighted. In dynamic micro-simulations, the data records "weights" linking individuals and their attributes, giving multiple sub-processes that provide an overall eagle-eye perspective and an understanding of single individuals at the same time; static micro-simulations do not take changes over time or circumstance into account. For the same problem, a System Dynamics model will obtain a deterministic result with the same predictions as the micro-simulation model, using mathematical formulae. The methodologies vary, but the eventual prediction is the same, which allows for better verification opportunities.
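The sketch below suggests what the individual records of a dynamic micro-simulation might look like: each record carries attributes and a survey weight, the records age endogenously year by year, and aggregates are recomputed from the updated, weighted records. The attributes, transition rates, and sample size are illustrative assumptions, not drawn from any particular study.

```python
import random

random.seed(1)

# A hypothetical micro-dataset: each record is an individual with a survey weight.
population = [{"age": random.randint(20, 60), "employed": True, "weight": 1.0}
              for _ in range(1000)]

def simulate_year(records, retirement_age=65, p_job_loss=0.03):
    """One year of a dynamic micro-simulation: records age endogenously and
    change state, so aggregates must be recomputed from the updated records."""
    for person in records:
        person["age"] += 1
        if person["age"] >= retirement_age:
            person["employed"] = False
        elif person["employed"] and random.random() < p_job_loss:
            person["employed"] = False
    return records

for year in range(10):
    population = simulate_year(population)

# Eagle-eye aggregate and a single-individual view from the same records.
employed_weight = sum(p["weight"] for p in population if p["employed"])
total_weight = sum(p["weight"] for p in population)
print("employment rate after 10 years:", round(employed_weight / total_weight, 3))
print("one individual record:", population[0])
```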
Agent-based modeling breaks a real-world problem down into its physical agents, whereas System Dynamics modeling breaks it down into a network of stocks and flows according to the functions they are ascribed. Agent-based models resemble micro-simulation models at the individual level, especially dynamic micro-simulations, since both factor time into the model as a variable. In agent-based modeling, however, agents also interact with their environment.
Any micro-simulation or System Dynamics model can also be expressed as an agent-based model, but the reverse is not necessarily true.
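Finally, a minimal agent-based sketch, loosely Schelling-style rather than any model from the references: agents occupy a grid environment, inspect their neighbourhood, and relocate when too few neighbours share their group. The grid size, tolerance threshold, and number of agents are arbitrary assumptions.

```python
import random

random.seed(7)
SIZE = 20                       # the environment: a SIZE x SIZE grid
GROUPS = ("red", "blue")

# Place agents with a group attribute on random empty cells.
occupied = {}
while len(occupied) < 300:
    cell = (random.randrange(SIZE), random.randrange(SIZE))
    occupied.setdefault(cell, random.choice(GROUPS))

def similar_share(cell):
    """Share of occupied neighbouring cells holding the same group as `cell`."""
    x, y = cell
    neighbours = [occupied[(x + dx, y + dy)]
                  for dx in (-1, 0, 1) for dy in (-1, 0, 1)
                  if (dx, dy) != (0, 0) and (x + dx, y + dy) in occupied]
    if not neighbours:
        return 1.0
    return neighbours.count(occupied[cell]) / len(neighbours)

# Each step, agents react to their local environment: the dissatisfied relocate.
for _ in range(30):
    for cell in list(occupied):
        if similar_share(cell) < 0.4:          # illustrative tolerance threshold
            new_cell = (random.randrange(SIZE), random.randrange(SIZE))
            if new_cell not in occupied:       # move only to an empty cell
                occupied[new_cell] = occupied.pop(cell)

average_similarity = sum(similar_share(c) for c in occupied) / len(occupied)
print("average neighbourhood similarity:", round(average_similarity, 3))
```

Even this tiny example shows the difference from the stock-and-flow view: the outcome depends on where individual agents happen to sit and whom they happen to meet, not on aggregate quantities alone.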
References
- Gilbert, N. and Troitzsch, K.G. (2005), Simulation for the Social Scientist (2nd Edition), Open University Press, Milton Keynes, UK.
- Miller, J.H. and Page, S.E. (2007), Complex Adaptive Systems, Princeton University Press, Princeton, NJ.
- Simon, H.A. (1996), The Sciences of the Artificial (3rd Edition), MIT Press, Cambridge, MA.
- Cioffi-Revilla, C. (2017), Introduction to Computational Social Science: Principles and Applications, Springer, New York, NY.
- Axelrod, R. (1997), ‘Advancing the Art of Simulation in the Social Sciences’, in Conte, R., Hegselmann, R. and Terno, P. (eds.), Simulating Social Phenomena, Springer, Berlin, Germany, pp. 21-40.
- De Sola Pool, I. and Abelson, R. (1961), ‘The Simulmatics Project’, The Public Opinion Quarterly, 25(2): 167-183. Retrieved from http://www.jstor.org/stable/2746702
- Birkin, M. and Wu, B.M. (2012), ‘A Review of Microsimulation and Hybrid Agent-Based Approaches’, in Heppenstall, A.J., Crooks, A.T., See, L.M. and Batty, M. (eds.), Agent-based Models of Geographical Systems, Springer, New York, NY.
- Crooks, A.T. and Heppenstall, A.J. (2012), ‘Introduction to Agent-based Modelling’, in Heppenstall, A.J., Crooks, A.T., See, L.M. and Batty, M. (eds.), Agent-based Models of Geographical Systems, Springer, New York, NY.
- Epstein, J.M. and Axtell, R. (1996), Growing Artificial Societies: Social Science from the Bottom Up, MIT Press, Cambridge, MA.
- Weidlich, W. (1994), ‘Synergetic Modelling Concepts for Sociodynamics with Application to Collective Political Opinion Formation’, The Journal of Mathematical Sociology, 18(4): 267-291.
- Waldherr, A. and Wijermans, N. (2013), ‘Communicating Social Simulation Models to Sceptical Minds’, Journal of Artificial Societies and Social Simulation.