To foster distributed robustness we should encourage exploration of the “adjacent possible” through local innovation, rapid prototyping and “fail-safe” solutions, whilst building in safeguards and respect for agreed limits. One size does not fit all.
For some engineering problems naïve realism and rationalism work (bridges do not usually fall down!) but we are creating serious problems in disciplines like biology and infrastructure systems by applying this approach to life. As is commonly observed, top-down modelling and prediction of self-organised biological and social systems is fraught with danger.
With the rise of social media, of distributed sensing and of information systems giving us access to real-time data about network responses, we are dealing increasingly with a more social kind of science. (I am sure many scientists will call this a more social “kind-of” science!) Despite our best efforts at naïve realism, and our wish that Jonah’s 1st law applies, we are stuck with the 2nd law. Instead of structure determining function there is continuous and dynamic reflexive interchange between the two. Such systems are not formally computable.
I have argued that in 2nd order systems there is no system without an observer and that difference and meaning imply boundaries and define the environment. Different entities and agents will see different systems. In human terms this is particularly true in social and institutional contexts where differing mind and value sets defeat inter-disciplinarity (at least until a long process of discussion, alignment of vocabularies and mutual learning has been endured). Ulrich Beck’s sub-politics make finding any consensus difficult.
A change of the scientific mindset is required to grasp the key role of adaptive localism in action and response. There are no universals in 2nd order systems. Distributed robustness ensures anti-fragile flexibility. Such systems are not optimal or equilibrium systems and we must expect non-stationarity and uncertainty. Heterogeneity in design and in function ensures resilience rather than abrupt tipping points.
Such “systems” exhibit unstable and indiscernible cause-effect relationships because of adaptive and reflexive interactions. Distributed robustness gives rise to periods of “free run” between constraints as chance and contingency change components. Evidence is poor because of this. If there are any Bayesian priors they are low-level constraints like the evolved physiology and behaviour of the component agents. This is where we should seek explanations.
Given the importance of local options, chance, contingency and context in both the biological and the social realms, the best we can do may well be to avoid the worst through the management of the “supply side” context. This means acknowledging and encouraging variability – not trying to eliminate it – expecting change and living with uncertainty. Infrastructure projects are usually designed by engineers to eliminate variability rather than work with it.
The rationalists’ revenge – the approach to systems and complexity via Big Data – can also lead to big error. Resorting to statistical and correlative approaches – effectively theory-free science – is no help. We have known for a very long time that correlation does not mean causation and that equifinality disrupts the search for laws. This explains why repeatability is difficult, why there are an increasing number of manuscript retractions and “why most published research findings are false”.
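The correlation-causation trap is easy to demonstrate numerically. In this minimal sketch (the variable names and the shared trend are illustrative assumptions, not data from any study), two causally unrelated series correlate strongly simply because both track a common confounding trend:

```python
import numpy as np

rng = np.random.default_rng(0)
trend = np.linspace(0, 10, 200)          # shared confounding trend (e.g. season)
x = trend + rng.normal(0, 0.5, 200)      # series driven by the trend plus noise
y = 3 * trend + rng.normal(0, 0.5, 200)  # a second, causally unrelated series
r = np.corrcoef(x, y)[0, 1]
print(round(r, 3))  # strong correlation despite no causal link between x and y
```

Remove the trend (detrend, or difference the series) and the apparent relationship largely vanishes – which is exactly why correlative, theory-free analysis of big data is so treacherous.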
Risk cannot be managed in the usual manner – robust distributed systems minimise risk by quickly finding the adjacent possible. Throughout all this 1st and 2nd thoughts (thinking about thinking) are being constantly evaluated by 3rd thoughts. Organisms possess evolved anticipatory models and show intentional behaviour. At the human level the moral stance is becoming more and more important.
Localism in the social/institutional response involves taking into account regional differences in tastes, ethics, values, honour and esteem (even shame). These define the available options and preferences. System and component designs must take local cultures and beliefs into account. This is not an argument for relativism; there are natural and biophysical constraints on human actions. Since I have argued for intrinsic and existential value in the ecosystems upon which we depend for our survival, these must, in various ways, be accorded local rights in addition to human interests. Mere monetisation does not suffice.
Accepting natural variability and intervening only when a threshold is reached has led to progress in environmental management. But, as we would now expect, these thresholds are debatable and set in social contexts. Any such management requires close collaboration and sharing of data, values and intentions between institutions and local communities.
We now have the technology to be able to do this kind of thing in real time. Abandoning the use of universal models for analysis and prediction cuts against the grain of institutionalised science but the received approach is now being replaced by the acquisition of high frequency data combined with rapid social feedback through the incorporation of social media feeds in real time. Data can be obtained from simple in situ monitors.
It seems, in fact, that massive parameterised models – with their associated calibration and validation problems – can often be effectively replaced by distributed in situ data collected in real time. Rather than using a large 3-D hydrological model for flood prediction and warning, it may be sufficient merely to have a network of in situ monitors reporting water level and rate of rise at key locations in real time.
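A minimal sketch of such a warning network (the thresholds, units and rate-of-rise rule here are illustrative assumptions, not values from any operational system): each station need only compare its latest level, and the rate of rise between its last two readings, against agreed limits.

```python
from dataclasses import dataclass

@dataclass
class Reading:
    minutes: float   # time of reading, minutes since start of record
    level_m: float   # water level at the gauge, metres

def flood_warning(readings, level_limit=2.5, rise_limit=0.1):
    """Warn when the latest level (m), or the rate of rise (m/min) between
    the last two readings, exceeds an agreed limit. Limits are illustrative."""
    latest = readings[-1]
    if latest.level_m >= level_limit:
        return "WARN: level"
    if len(readings) >= 2:
        prev = readings[-2]
        rate = (latest.level_m - prev.level_m) / (latest.minutes - prev.minutes)
        if rate >= rise_limit:
            return "WARN: rising fast"
    return "OK"

print(flood_warning([Reading(0, 1.0), Reading(10, 1.2)]))   # slow rise, below limit
print(flood_warning([Reading(0, 1.0), Reading(10, 2.6)]))   # level breach
```

No calibrated 3-D model is involved: the “prediction” is local, cheap, and driven entirely by what the gauges report in real time.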
Active social engagement can now be encouraged by accessing Twitter feeds and the like in real time – using local residents (with all their values and differing mind sets) as reporters and data gatherers. This has recently been done in Jakarta through the PetaJakarta.org project. By combining data-gathering sensors with local people and their mobile phones, the SMART Infrastructure Facility at the University of Wollongong has effectively constructed one of the first real-time, self-organising socio-technical (SOST) systems.
We actually live in SOST systems, but the feedback (derived from weak evidence) is usually poor and the time lags (due to institutional and social inertia) are long. They have properties that are not amenable to standard scientific methodologies and management. To ensure fairness, compensation must be paid locally (and must be seen to be paid). Nothing should be too big to fail, thus avoiding rorts and gaming of the system.
In dealing with natural and man-made self-organised systems we must seek collective solutions to design, management and restoration problems. Solutions between socialism and libertarianism must be found that respect difference and rely on trust, ethics, context and collaboration. Market-based solutions are insufficient because SOST systems are not “efficient”, market traders cannot have complete information and partial knowledge can lead to inequality, herding and other distortions. Market-based solutions cannot be applied to intrinsic values either. These are plesionic systems so they must be dealt with on the basis of community justice, equity and fairness.
Massimo Pigliucci has analysed the basis of fairness at length in his “Rationally Speaking” blogs and has concluded that the ethical solution is a combination of virtue ethics and contractarianism – taking both individual and collective values and responsibilities into account. We should remember there are ethical questions that science cannot answer.
Self-organised biological and socio-technical systems do show universal scaling but we must be careful how to interpret these results. The data must not be over-interpreted because many such systems show equifinal central limit phenomena. Where the data are adequate the overall patterns are derived from metabolic constraints on local interactions.
Watersheds and other systems inhabited by people and organisms can exhibit 1/f scaling relationships that may have no stable statistical properties at all, making trend detection and analysis a very fraught prospect. Taking averages in these systems is meaningless and destroys information. SOST systems are nowhere near an equilibrium state; they show non-stationary trajectories and non-normal statistical distributions. Nevertheless, new information and communications technologies that provide high-resolution data allow us to reveal hidden properties of these systems.
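A short numerical sketch makes the point about averages (the synthesis method and series length are illustrative choices): noise constructed to have power ∝ 1/f has a running mean that keeps drifting rather than converging, so any “average” depends on where and how long you happen to sample.

```python
# Synthesise 1/f ("pink") noise spectrally, then watch the running mean drift.
import numpy as np

rng = np.random.default_rng(42)
n = 4096
freqs = np.fft.rfftfreq(n, d=1.0)
amp = np.zeros_like(freqs)
amp[1:] = 1.0 / np.sqrt(freqs[1:])            # amplitude ~ 1/sqrt(f) => power ~ 1/f
phases = rng.uniform(0, 2 * np.pi, len(freqs))
spectrum = amp * np.exp(1j * phases)           # random phases, prescribed power
signal = np.fft.irfft(spectrum, n=n)

# Running mean over ever-longer windows keeps drifting instead of converging.
running_mean = np.cumsum(signal) / np.arange(1, n + 1)
print(running_mean[[255, 1023, 4095]])
```

For white noise those partial means would shrink toward a stable value; for 1/f noise the low-frequency power grows without bound, so the longer you average, the more the slow wander dominates – averaging destroys exactly the information that matters.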
This is a completely different approach from the usual consultant-driven Environmental Impact Statement boondoggle involving naïve realist predictive models with only lip service being paid to community consultation. 
The usual approach to EIS fails on both these counts; the assessments and predictions made are often useless because of epistemic uncertainty and, because the project and the solution are defined before any attempt at community consultation, the community feels disenfranchised. Unstructured complex issues, where cause and effect are unclear, suffer from conflicting values and disputes over what constitutes a fact. The technocratic regulatory model is no longer sufficient because the social commons has become the de facto regulator. As Esther Turnhout has written, “participation creates citizens”.
Only a more social kind of science can address these concerns.
For footnotes go to: