Title: The technological 'exposure' of populations; characterisation and future reduction
Author: Lindsay J. Robertson
Date: 2020-05-25
Journal: Futures
DOI: 10.1016/j.futures.2020.102584

The nature and level of individuals' exposure to technological systems has been explored previously and is briefly restated here. This paper demonstrates how the concept of technological exposure can be extended to the generic needs of individuals, and further to the needs of populations of individuals, and even as far as "existential threats" to humanity. Technological categories that incur high levels of population exposure are explored and described. A theoretical basis for reducing population exposure is developed from the basic concepts of technological exposure. Technological developments that could enable less centralised societies with lower levels of population exposure are considered for practicality and effectiveness, as are the factors that could allow and cause a transition to a less technologically centralised model. Some conclusions regarding practicality, triggers, and issues arising from a decentralised society are considered, and include the key conclusion that a higher level of decentralisation and exposure reduction is both desirable and possible.

The fundamental impossibility of never-ending growth has been recognised for a long time (e.g. Daly 1990, Henrique et al. 2013), and similarly authors (e.g. Rao 1975) have recognised the dangers posed by corporates of transnational size. This paper considers an associated but distinct issue, the growing technological vulnerability of populations. Increasingly, essential goods and services are only accessible via technological systems that are both sophisticated and centralised. In this situation, end-user technological vulnerability becomes significant, but quantifying the extent and nature of such vulnerabilities has been hindered by the complexity of the analysis (Haimes & Jiang).

The supply of essential goods and services will involve heterogeneous systems (i.e. systems where goods/services are created by a progression of steps, rather than a system in which completed goods are simply transported/distributed), that involve an arbitrary number of linked operations, each of which requires inputs, executes some transformation process, and produces an output that is received by a subsequent process. Four essential principles have been proposed (Author 2017) to allow and justify the development of a metric that evaluates the contribution of a heterogeneous technological system to the vulnerability of an individual (Author 2017a). These principles are:

1) The metric is applicable to an individual end-user: when an individual user is considered, not only is the performance of the supply system readily defined, but the relevant system description is clearer.

2) The metric acknowledges that events external to a technology system only threaten the output of the technology system if the external events align with a weakness in the technology system. If a hazard does not align with a weakness then it has no significance. Conversely, if a weakness exists within a technological system and has not been identified, then hazards that can align with the weakness are also unlikely to be recognised. If the configuration of a particular technology system is changed, weaknesses may be removed while other weaknesses may be added.
3) The metric depends upon the observation that although some hazards occur randomly and can be assessed statistically, over a sufficiently long period the probability of these occurring approaches 1.0. The metric also depends upon the observation that intelligently (mis)guided hazards (i.e. arising when a person intentionally seeks a weakness in order to create a hazard) do not occur randomly. The effect of a guided hazard upon a risk assessment is qualitatively different from the effect of a random hazard. The guided hazard will occur every time the perpetrator elects to cause the hazard, and therefore such a hazard has a probability of 1.0.

4) The metric depends upon the observation that it is possible not only to describe goods or services that are delivered to the individual (end-user), but also to define a service level at which the specified goods or services either are or are not delivered. This approach allows the output of a technological system to be expressed as a Boolean variable (True/False), and allows the effect of the configuration of a technological system to be measured against a single performance criterion.

Applying these principles, the arbitrary heterogeneous technological system supplying goods/services at a defined level to an individual can be described by a configured system of notional AND/OR/NOT functions. Having represented a specific technological system using a Boolean algebraic expression, a 'truth table' can be constructed to display all permutations of process and stream availabilities as inputs, and the technological system output as a single True or False value. From the truth table, count the cases in which a single input failure will cause output failure, and assign that total to the variable "E1". Count the cases where two input failures (exclusive of inputs whose failure will alone cause output failure) cause output failure, and assign that total to E2. Count the cases in which three input failures cause output failure (and where neither single nor double input failures within that triple combination would alone cause output failure) and assign that total to the variable "E3", and similarly for further "E" values. The exposure metric, {E1, E2, E3…}, for a defined system supplying specific goods/services, can be shown to measure the level of vulnerability of the individual to that technological system (Author 2017).

For many practical cases a generic service could be considered (e.g. "basic food" rather than "fresh milk"), and vulnerability assessed by nominating multiple specific services as alternative suppliers (an "OR" gate) of a nominated generic service. The exposure of the individual to lack of secure access to the necessities of life can thus be assessed using exactly the same approach used to assess exposure to non-availability of any specific need. Illustrative examples of generic services could include "all needs for Maslow's physiological level", which conceptually would include ((Food_A OR Food_B OR Food_C…) AND warmth AND water AND shelter), or "all needs to occupy an apartment on a month-by-month basis" (sewage services AND water supply AND power supply).

The concept of an individual's exposure may be yet further modified and extended to consider the quantification of a population's exposure, in regard to supply of either a particular, or a generic, service.
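The counting procedure described above can be illustrated with a short sketch. This is a minimal, illustrative implementation only, assuming a hypothetical representation in which the technological system is expressed as a Boolean function of its input/process availabilities; the function and variable names are not from the paper.

```python
# Minimal sketch of the exposure metric {E1, E2, E3, ...} described above.
# Assumption: the system is given as a Boolean function of input availabilities
# (True = available); names below are illustrative only.
from itertools import combinations

def exposure_metric(system, inputs, max_order=3):
    """Return [E1, E2, ...] where En counts n-input failure combinations that
    cause output failure and contain no smaller combination that already does."""
    minimal_cuts = []          # failure sets already known to cause output failure
    exposure = []
    for order in range(1, max_order + 1):
        count = 0
        for combo in combinations(inputs, order):
            failed = set(combo)
            # skip combinations that contain an already-counted smaller cut
            if any(cut <= failed for cut in minimal_cuts):
                continue
            state = {name: (name not in failed) for name in inputs}
            if not system(state):          # output fails: minimal cut of this order
                minimal_cuts.append(failed)
                count += 1
        exposure.append(count)
    return exposure

# Illustrative generic service: water AND power AND (food_a OR food_b OR food_c)
def basic_needs(s):
    return s["water"] and s["power"] and (s["food_a"] or s["food_b"] or s["food_c"])

print(exposure_metric(basic_needs, ["water", "power", "food_a", "food_b", "food_c"]))
# -> [2, 0, 1]: two single points of failure (water, power) and one triple
#    combination (all three alternative food sources failing together).
```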
It is common for some "population" to need the same services/goods, and to obtain these via an identical technological system. The "technological system" that supplies such a population may not be identical to that which supplies an individual. Consider a case in which four individuals each rely on their cars to commute to work; for the output "arrive at work", each of their individually-owned cars contributes to the E1 exposure of each of the four individuals. If they reach a car-pooling agreement, the "car" then contributes to the E4 exposure for "arrive at work" of each individual, since any of the four cars can supply that functionality. A population exposure is developed by modifying an individual exposure evaluation (for a population using a functionally identical service), considering each functional element used by more than one member of the population, and whether each such functional element can be made accessible to more than one.

If an exposure level (metric) is then nominated, it is possible in principle to establish the largest population which is subject to this level of exposure. For example, we might identify the largest population using a functionally identical service or supply of goods that has a non-zero E1 exposure (i.e. has at least one single point of failure). Should a city discover that the progressive centralisation of services had resulted in a situation where the whole population of the city had a non-zero E1 value for an essential service, some urgent action would be strongly suggested. Conversely, if the supply of a generic service to even small suburbs had low exposure values, then a good level of robustness would have been demonstrated for that generic service, at the suburb level. Extrapolating these principles, where a "population" is actually the totality of the Homo sapiens species, and the "service" is essential for every individual's survival, a definition of "existential threat" as proposed by Bostrom (2002) is approached.

The analysis of exposure considers loci of failure and hence intentionally avoids consideration of specific hazards or their probabilities. Nevertheless, hazards that affect various populations certainly exist, and a short list will serve to illustrate the significance of evaluating population exposure levels. The following list makes no attempt to be comprehensive, but illustrates hazards relevant to large corporates, apartment-dwellers, members of isolated communities and users of electronic communications.

- A large group could be criminally held to ransom by malicious encryption of essential data (Pathak 2016).
- A large group could be effectively excluded from access to a service because of conditions imposed by a (single, national) owner (Tanczer et al. 2016, Deibert 2003, Sandywell 2006).
- An isolated group (on Antarctica, the Moon or perhaps Mars) could cease to be viable if it cannot import supplies that cannot be generated locally (Pękalski & Sznajd-Weron).
- The occupiers of an apartment block could encounter a situation where there was minimal (financial or other) pressure on the provider of sewage removal services to restore service, but where the occupiers faced major consequences (the need to de-camp) in the absence of this service. Similarly, (Author 2017a) has noted that a failure to provide financial transaction services to a small retailer may incur negligible financial cost to a bank, but may cause the retailer's business to fail.
These are examples of asymmetric consequences.

- The whole of humanity exists on the planet Earth: a failure of the natural environment can doom either a regional or possibly a global population (Diamond 2005), and can certainly be considered an "existential threat", as defined by Bostrom (2002).
- Persons born early in the 20th century were very familiar with the possibility of highly unjust treatment (e.g. loss of employment) if private information became known, and they acted accordingly. For many born early in this century, such concerns seem remote. Personal information privacy issues have thus failed to gain high visibility, and we have perhaps become blasé about the costs of lack of privacy. Nevertheless, the ubiquity of surveillance (Miller 2014, Citron & Gray 2013), the increasing frequency of major data breaches (Edwards et al. 2016), and the recent rise in algorithmic decision-making (Moses 2018) on such matters as health insurance, credit worthiness, validity of job applications and security targeting are bringing this issue to increased prominence.
- The recent revelation of hardware vulnerabilities in general processors (Eckersley & Portnoy 2017, Ermolov & Goryachy 2017) demonstrated the significance of unappreciated weaknesses, even when overall operation of a system has been reliable.

Other publications (Author 2010, Martin 1996) have noted society's vulnerability to technological systems and considered the relationship between time-to-disaster and time-to-repair: that analysis was not an exposure analysis, but did consider the range of services corresponding to a broad category of human needs, and effectively considered the significance of the exposure categories noted above. Later works (Author 2017a) have analysed the individual exposure associated with a broad range of goods and services. Those works also noted that the analysed values for these specific examples could be reasonably extrapolated to many other similar goods and services. The general analyses of (Author 2010) did attempt to cover the scope of individual needs and as such identified a range of services (or goods-supply needs) that were considered to incur high levels of vulnerability. The more detailed exposure analyses of (Author 2017a) were indexed to the exposure of an individual, and although a limited number of examples were studied, these were considered to be representative of the high-exposure items relevant to individuals living in urbanised settings in the 21st century.

The assessment of the population exposure of those exemplar technological systems shows several distinctive changes from the analysis indexed to the individual. Primarily, the E1 items that arise close to the final point of use by the individual lose significance, and for many cases cease to be significant at the E3 level (arbitrarily chosen as the cut-off point for consideration). The components that remain significant are those that genuinely represent exposure values for the population under consideration.
The analysis of population exposure across a broad spectrum of needs showed that it was possible to identify some high exposure technological fields, specifically complex components, complex artificial substances, finance, communications, energy and information. These are considered more fully as follows:

- Complex components: Complex components may be distinguished from large components (such as a building, hydroelectric dam, or road-bridge), and qualitatively described as being beyond the capacity of a skilled individual craftsperson to fashion in reasonable time and to required tolerances. Under this category we might consider such items as a carburettor body for an internal combustion engine, food processing equipment and construction machinery (a crane). For many consumer items within the "complex component" category, the population exposure is significantly lower than the individual exposure, since numerous examples of the item exist (either owned by others, or stockpiled for sale). The level of centralisation may however be very high, with (for example) perhaps only one worldwide source of a carburettor body for a specific vehicle.

- Complex artificial substances: Complex artificial substances include advanced metallurgical alloys, advanced plastics and composites, drugs, and vaccines. These are distinguished by complexity of composition, rather than complexity of form. In many cases the centralisation that causes high exposure levels for the production of complex substances has resulted primarily from available economies of scale, and only secondarily from the substances' complexity. Some complex substances (notably pharmaceuticals) have patent protection, which creates centralisation at up to global level.

- Finance: As the range of goods and services, and the geographical scope of supply of those goods and services, has increased, so has the need for the facility to exchange value in return for goods and services: this has inevitably led to elaborate mechanisms for secure exchange of value. Recognising that the exchange of value can also facilitate illegal activities, state actors have also enacted significant surveillance and control of financial transactions. Technologies associated with the exchange of value have acquired high levels of exposure (colloquially, many things that can go wrong) and high levels of centralisation, in the process of meeting these demands.

- Communications: The ubiquity of the internet has become a remarkable feature of the last 20 years: although there are exceptions, most areas of the earth and a very large proportion of the total population can communicate via the internet. Although cell-phone use is common, it is evolving as a mobile access connection, with the internet carrying the data. Internet communications have been made possible by well-established protocols; however, high-level systems for routing continue to be centralised, and while problems are rare, nation-states have occasionally decided to discontinue connection to the internet, showing that high levels of centralisation exist (Howard et al. 2011). Although the feasibility of internet communications can be attributed to open source protocols, the practicality of current connectivity has been largely enabled by the very large data rates possible via fibre-optic cables, yet this capacity has also introduced a high level of exposure for both individuals and populations.
Duplication of undersea fibre-optic cables has somewhat reduced the level of population exposure, yet dependence on high data rates has increased at a similar pace (market forces have driven the need for additional cables), and communications via the internet carry a high level of population exposure. Recent reports (Rose 2017, Clark 2016) have noted that the disruption of a very small number of fibre-optic cables would lead to unacceptable slowdowns, and trends to ownership of undersea cables by corporate entities further contribute to E1 values of exposure and high population exposure. The numbers of cables in service show that internet communications are centralised at a small-nation level.

- Energy: Energy has been noted as key to civilisation (Smil 2017). Coke allowed the smelting of iron, and oil enabled almost all current transportation. Coal, nuclear and geothermal heat and hydro storage allow electricity generation on demand. National power transmission systems generally have high reliability and many have some level of design redundancy. Nevertheless, the generation and distribution of electric power incurs significant individual and population exposure: large power stations are bespoke designs, as are large transformers (Sobczak & Behr 2015), and transmission networks may face significant "resilience" issues (Carvajal et al. 2013, Sidhu 2010). Liquid fuels can be stockpiled at national level, and to a lesser extent locally; however the level of stockpiling is limited, and even the higher levels of stockpiling are likely to be small compared to the time to rebuild a major production plant (terminal, or refinery). At the consumer and industrial user level, although solar PV and wind can produce power at $/kWh rates close to thermal generation, there is currently no economically viable technology that allows storage of megawatt-hour quantities of energy.

- Information: Information, whether medical reference information, contractual records or engineering design information, is not a coincidental by-product of a technological society; it is information that fundamentally allows the design and construction of technological systems and the full operation of society (Dartnell 2014, Shapiro 2009, van den Heuvel 2017). Yet while it has become possible to generate and transmit enormous quantities of information, information storage remains a particularly high-exposure issue (Bergeron 2001). Currently, ASCII codes largely standardise information representation, and protocols for transmission are also largely standardised, but a gap in the standardisation of information storage (including writing and recovery) contributes to a high exposure for the final delivery of information to users. Hard-disk drives are still the most common storage technology, yet these have short lives, and use proprietary data storage formats plus proprietary approaches to the writing and recovery of data. This issue has been well reported; authors such as Cerf (2015) have predicted a "digital dark age", i.e. a future society that cannot recover/read most of the information generated in the current era. This describes a situation of high individual and population exposure, and since there are few manufacturers of HDDs, information storage can also be noted to illustrate centralisation at multi-nation level.
Several technologies allowing long-term storage have been proposed (Kazansky et al. 2016, the Longnow Foundation, Nanoarchival™ technology) but these are currently expensive and lack the integration to allow them to truly offer changes to population exposure.

By contrast, some fields have generally low population exposure: basic building materials, foodstuffs, natural fabrics and clothing are commonly supplied via relatively simple technological systems in which technological design redundancies are commonly large, and for which population exposure is therefore low. Technologies such as the creation of ceramics and glassblowing may require skill but are not dependent on high technological sophistication, and so contribute low technological exposure. Similarly, the collection and storage of rainwater is feasible with low technological exposure even for densely populated urban populations.

In addition to identifying high exposure fields, it is also possible to identify categories of exposure contribution that commonly apply across a range of technological fields. These are proposed to include initial resource availability, complex unit operations, lack of buffering, single points of failure (SPOF), contributory systems, highly centralised processes and "practical unavailability", and are described more fully as follows.

Initial resource availability: All technological systems producing goods and services for users ultimately depend upon raw materials and viable environmental conditions. Where raw materials cease to be available, or environmental conditions change permanently, services to users will inevitably be affected. Raw material supplies and acceptable environmental conditions must therefore be identified (Diamond 2005) as sources of exposure and hence vulnerability to users.

Complex unit operations: We use the descriptor "complex" as a characteristic of a process whose internal operation is practically unknowable to the user and cannot realistically be repaired by the user. Personal computers, routers and related equipment are examples. It is also possible to consider situations where a critical application has been compiled from an outdated programming language and runs on a computer for which no spare hardware is available (Hignet 2018). Another example might consider critical information held on a very old storage medium (Teja 1985). These examples illustrate three categories of complex processes. In the first case, while the inner workings of a PC may be exceedingly complex, the format of incoming data (TCP/IP packets) and protocols (WWW, email etc.) are in the public domain (Fall & Stevens 2011), and so it is not only possible but practical for alternative machines to offer the same services. In the second case, assuming the functional specifications of the application processes are known, the application can be re-coded (and fully documented) using a language for which larger numbers of maintenance programmers exist, and on a more common platform. The third case, of data encoded on an old storage medium, illustrates a subcategory where the internal details of the storage are proprietary (not in the public domain), alternative equipment is unavailable, and creation of replica equipment for reading the data is probably impractical, leading some authors (e.g. Cerf 2015) to express fears of a "digital dark age".

Lack of buffering: For the supply of long-life products, it is both possible and practical to provide buffer stocks at various points in the process.
By contrast, since AC power (for example) is not readily storable, all processes involving uses of AC power will fail immediately if the power supply fails.

Single points of failure (SPOF): All single points of failure contribute to E1 values and so make a primary contribution to users' vulnerability. Three subcategories of SPOF are noted. The first is where delivery of services to users involves some processes immediately adjacent to the user, known as "last mile" services in the telecommunications field. The second subcategory is illustrated by considering a small rural town whose EFTPOS, landline phone service, cell-phone service and internet connection have all been progressively migrated to data services carried by a single fibre-optic cable, inadvertently creating a SPOF. The third is where a particular failure will inevitably cause failure of other components that are not functionally connected: a cascading failure.

Finally, it is noted that contributory systems are a common source of exposure: whenever a system is made dependent upon another, the contributory system's exposures are reflected in the total exposure to the user. A common example is where a simple user service is made dependent upon internet access; the mandatory internet access may add huge levels of exposure to a system that would otherwise incur low levels of vulnerability. Some specific technologies including artificial intelligence, nuclear weapons and asteroid strikes have been examined, and authors such as Baum (2018) have pondered their potential to incur existential threats by threatening multiple systems. Others, including Baum (2015) and Green (2016), have considered approaches to limiting the scope of such hazards.

The above categories of exposure may apply to any technological process; there are additionally several categories that are specifically relevant to the study of "population exposure". These include highly centralised processes: for example, the evacuation and treatment of sewage requires a network of pipes and pumps to collect sewage and deliver it to the treatment station. This is an example of a centralised system that is large but technologically simple. Other examples of large centralised systems include financial transaction systems (O'Mahony et al. 1997), and the international data transmission systems of undersea fibre-optic cables (Clark 2016). Such systems tend to be monopolies and are commonly controlled by entities that have little if any obligation to provide service or to negotiate terms acceptable to individual users. Authors such as Li et al. (2019) have shown that highly interconnected systems have similar characteristics to centralised systems, and it is well recognised that the most "centralised" system is earth's natural environment, because the natural environment (including air and water) is essential to all life.

Practical unavailability: Consider the hypothetical case where a user wishes to communicate sensitive information, but only has access to one data transmission facility that is known to be under surveillance. Although technically operational, the inevitable absence of privacy associated with that data transmission facility has made the facility practically unavailable. For technological systems that are highly centralised and near-monopoly, practical unavailability is a significant possibility.
The En metric has been explained earlier in terms of its significance for measuring vulnerability; however, a wider and more future-oriented applicability of the metric itself can also be demonstrated in several ways. The assumption that there is an average cost of defending any vulnerability could be challenged, but across a broad enough spread of examples it is workable. Under that broad assumption, whereas the cost of defence for an E3 exposure contributor is precisely the same as the cost of defence for a contributor to an E1 vulnerability (it is the cost of protecting a single vulnerability), the cost of mounting a successful attack on an E3 vulnerability is three times greater than the cost of attacking a vulnerability contributing to the E1 value, since the attacker needs to identify and successfully attack all three E3-contributory nodes simultaneously. In addition to its value for measuring vulnerability, the exposure metric is therefore also broadly significant for planning the reduction in vulnerability of a system.

Considering the generation of an exposure metric, the process itself provides valuable insights into options for reduction of the final values. The process of generating an exposure metric must be started from the delivery point, and must follow a systematic redrawing and track-back process until a justifiable end-point is reached. The selection of a justifiable end-point has been addressed elsewhere, and could be associated with a "no further contributors to an En level" criterion. The process of achieving a final representation of the system will very likely require a progressive redrawing of stream and process relationships; an example of a common redrawing is presented in Figure 1. In a practical process of building an exposure metric, subsystems (which may have been analysed separately) commonly contribute goods or services to "higher" systems. If the exposure metric of a subsystem is known, then there is little value in recreating a truth table for a higher-level super-system that re-explores all of the inputs to every subsystem. The more effective approach is to consider the point at which the subsystem output contributes to the input of a gate within the higher-level system, and how the subsystem exposure metric is transmitted/accumulated by the upstream gate of the super-system. We may generalise this process by considering that each input to a gate (Boolean AND or OR operation) has an exposure vector, and developing the principles by which the gate output can be calculated from these inputs. This is illustrated in Figure 2. For the AND gate, the contributory exposure vectors are simply added. For the OR gate, the issue is more complex, but the higher levels of exposure are quickly avoided. The process of generating the metric will itself highlight sources of large exposure, and hence specific options for reduction.

The detailed process for analysis of exposure has illustrated how the E values are accumulated and how specific exposure reduction options may be identified. Generalised approaches to the reduction of the exposure of a population can also be identified.
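The accumulation rules just described can be sketched in code. This is an illustrative reading of the AND/OR rules, not the paper's own procedure; it assumes that the branches feeding a gate share no contributors (the "genuinely independent" case discussed below), all names are illustrative, and the cut-off at six orders is arbitrary.

```python
# Minimal sketch: combining exposure vectors [E1, E2, E3, ...] through the
# notional AND/OR gates described above, assuming independent branches.
MAX_ORDER = 6   # arbitrary cut-off for the illustration

def and_gate(*vectors):
    """AND gate: any contributor failure propagates, so the counts simply add."""
    out = [0] * MAX_ORDER
    for v in vectors:
        for i, count in enumerate(v[:MAX_ORDER]):
            out[i] += count
    return out

def or_gate(*vectors):
    """OR gate: the output fails only when every branch fails, so the order of
    a combined failure is the sum of the orders of one failure set per branch."""
    combined = {0: 1}                  # {order: number of ways}; start with the empty set
    for v in vectors:
        nxt = {}
        for order, ways in combined.items():
            for i, count in enumerate(v[:MAX_ORDER]):
                o = order + (i + 1)
                if count and o <= MAX_ORDER:
                    nxt[o] = nxt.get(o, 0) + ways * count
        combined = nxt
    return [combined.get(o, 0) for o in range(1, MAX_ORDER + 1)]

# Four genuinely independent alternatives (e.g. four separately owned cars),
# each a single point of failure on its own:
car = [1, 0, 0, 0, 0, 0]
print(or_gate(car, car, car, car))   # -> [0, 0, 0, 1, 0, 0]: nothing worse than E4
print(and_gate(car, car))            # -> [2, 0, 0, 0, 0, 0]: two separate E1 loci
```

The two printed results correspond to the behaviour described in the text: OR-combining four independent single points of failure leaves no exposure more serious than E4, while AND-combining simply accumulates the E1 contributions.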
The process of redrawing shows that even where alternative subsystems can supply higher systems, if there is a locus of weakness that is common to the alternative subsystems, the exposure analysis will ensure that such facts are preserved and the E1 contribution will actually appear in the exposure value for the whole target consumer (group). If the "O" ring seal failure had been identified as a contributor to the E1 exposure of the Thiokol solid booster rocket, then the elimination of all E1 values for the parent system that was the Challenger space shuttle could not have been achieved unless the "O" ring weakness were addressed. The use of an exposure metric therefore potentially addresses the colloquial saying "it's always the smallest things that get you". One of the lessons from the analysis of the Challenger tragedy was that even known subsystem weaknesses could become lost as analyses were progressively summarised.

When genuinely independent, alternative sources are available, their effect on the next-highest systems (process, consumer or group) is illustrated by the operation of an "OR" gate. If four genuinely independent/alternative subsystems that each have exposure vectors with non-zero values of E1 are combined via an "OR" gate, the higher system does not see any exposure above the E4 level from that source. This principle might be qualitatively or intuitively perceived, but the effect upon an analysis of exposure provides a starkly quantitative result. It may also be observed that while reducing the exposure of each subsystem would require separately addressing each contributor to the subsystem's E1, E2 and E3 values (potentially a major task), if four genuinely alternative subsystems exist then the combined "OR" exposure has no non-zero contributor more serious than E4 and may warrant no further action.

Whereas many 20th century devices were designed for a single purpose and high throughput (mass production), some recent trends have been towards devices that can be repurposed, and the pinnacle of flexibility is the human being. For any case where a piece of equipment contributes to the E1 value (is a single point of failure), if the capability of that equipment were able to be undertaken by multiple options (e.g. alternative human operators), the exposure contribution may be reduced to the En level, where n is the number of alternatives or persons capable of undertaking the equipment's function. An illustration may help to clarify this principle: if a sensor provides input that is required by an upstream function, that upstream function and all higher dependencies are exposed to the capability of the sensor to provide input. If 'n' humans are able to provide their best assessment of the sensor's input, then the exposure of the higher system to that functionality is reduced to the En level.

Considering the high-exposure technological fields that were identified earlier, generalised approaches to reducing the accrued actual exposure include standardisation of specifications that allow competitive supply of complex components and complex substances, avoiding the large exposure of some contributory systems, and retention of genuine alternatives (e.g. cash/gold as well as electronic transactions). There is an intuitive appreciation that it is undesirable to have high vulnerability levels for systems that affect large numbers of persons.
Some may also intuitively appreciate that trends to centralisation, driven by economies of scale, increase the technological vulnerability of large population groups. The analysis of population exposure and the principles of exposure analysis therefore provide a quantitative approach that can be used not only to assess the level of exposure of current systems, but more importantly to show quite widely applicable principles for exposure reduction. Specifically, the analysis shows that centralisation of production almost inevitably creates higher population exposure values, and provides a sound theoretical basis for promoting decentralisation of production as an approach for the reduction of population exposure.

It is important to consider the practicality of population exposure reduction by decentralisation; this is reviewed within each of the fields previously identified as incurring high exposure.

- Complex components: In order for this functionality to be genuinely available at a significantly decentralised level, we can consider the sub-systems that are required, and the current level of maturity of technology options within each of those subsystems. For a complex component, relevant subsystems include those associated with the supply of materials and the creation of complex shapes from basic material stocks. For many cases, assembly of components is likely to be straightforward, but we must also consider cases where assembly itself requires specialised equipment. For the majority of cases, the composition of the complex component can be assumed to be uniform; however, cases where composition is not uniform (e.g. a multilayer circuit board) must also be acknowledged. Equipment for additive and subtractive manufacturing is available; specifically, 3D printing equipment is readily available at small scale, and large scale implementations (Bassoli et al. 2007, Kraft et al. 2013) have been tested. A moderately standardised format for 3D printer design information exists as ISO 14649-1:2003 and ISO 10303:2014 (many sections), and instructions for operating additive and subtractive manufacturing equipment are such that high skill levels are unnecessary. A significant body of designs exists (e.g. Pinshape™).

- Complex substances: In order for complex substance synthesis to be genuinely available at a significantly decentralised level, we can consider the types of complex substances that are of interest, the sub-systems that are required for each, and the current level of maturity of technology options within each of those subsystems. Broadly, the complex substances could be categorised as: complex alloys, complex inorganic liquids (oils, detergents, etc.), complex organic materials (pharmaceuticals, insecticides, herbicides), complex substances derived from living organisms (yeasts, vaccines, fermentation bacteria), and polymers. For a complex molecular substance, it is currently common for a range of supply chains to bring raw materials to a synthesis plant. A sequence of synthesis steps (including unit operations of heating and cooling, separation, reaction, precipitation and dissolution) is carried out to generate a complex substance. For "organic" compounds, temperatures are generally limited to below 120ºC. For a complex metallic component, it is currently common for granulated pure metals or metal compounds to be melted together, sometimes in vacuum or inert gas, after which a controlled set of cooling/holding/reheating steps is used to generate the final material.
Considering whether these syntheses could be practically decentralised, Drexler's seminal paper (Drexler 1981) considered the options and practicality issues associated with general-purpose synthesis of complex substances. Since 2010, the Engineering and Physical Sciences Research Council (EPSRC) has been funding a "dial-a-molecule" challenge which runs parallel to the work of others (Crow 2014, Peplow 2014), and commercial ventures such as "Mattersift™" (manipulating matter at the molecular scale) have demonstrated progress towards this goal. Even where general-purpose synthesis capabilities are not available, the availability of knowledge does in principle allow relatively complex syntheses, e.g. Daraprim™, to be undertaken by small groups (Hunjan 2016). For inorganic materials, progress has been made with solar pyrometallurgy (Neelameggham 2008), and although much development is needed, there would seem to be no fundamental reason why the required temperatures and composition constraints could not be achieved on a small scale and with limited equipment. Published proposals (Crow 2014, Peplow 2014) suggest that it is possible to build a standardised facility capable of carrying out an arbitrary sequence of unit operations required to make any organic compound. While the technological maturity of decentralised synthesis of complex materials is lower than that of decentralised production of complex components, many of the processes are actually feasible at present, and others are rapidly maturing.

- Finance: Decentralised tokens of wealth (tokens of exchange) have existed for as long as societies have existed. In order for a token of exchange to continue to avoid the large exposure of current centralised financial systems, a token must retain the qualities of ubiquitous acceptance and transparent and irrevocable transactions; this is currently feasible. Blockchain technology (Swan 2015) has recently offered another decentralised system for secure and irrevocable transfer of wealth, allowing broad acceptance, and thus meets the criteria for acceptability. Blockchain-based currencies are however reliant on a communications system that is currently highly centralised, and so fall short of the security expected, and exist in a number of as-yet-incompatible forms. The difficulties with current blockchain technologies seem to be solvable, and this technology offers a promising approach to decentralised exchange of value. Current high-security banknotes and bullion fulfil many of the requirements for a decentralised approach to transactions, and do not incur the exposure that is inherent in blockchain-based currencies, but do incur a high risk of theft and the exposure of a physical transmission system if the transaction is to span significant distance.

- Communications: A communications system could be considered decentralised (within the population size envisaged) when it has no E1 value above zero, and is capable of communicating with any other (decentralised) population. Secure encryption is currently possible (Schneier 2015), and despite some practical difficulties, mature approaches such as one-time pads seem to be proof against even projected (de Wolf 2017) technologies. Radio transmission on shortwave bands (using ionospheric reflection) of encrypted material can be received globally.
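As an aside, the one-time-pad approach mentioned above can be illustrated with a minimal sketch; the names are illustrative, the pad here is generated locally for demonstration, and in practice the generation, distribution and protection of pads (rather than the cipher itself) are the demanding, deliberately low-tech steps.

```python
# Minimal sketch of a one-time pad: XOR of the message with a truly random,
# never-reused key of at least equal length.  Illustrative only.
import secrets

def otp_encrypt(message: bytes, pad: bytes) -> bytes:
    if len(pad) < len(message):
        raise ValueError("pad must be at least as long as the message")
    return bytes(m ^ k for m, k in zip(message, pad))

otp_decrypt = otp_encrypt          # XOR is its own inverse

pad = secrets.token_bytes(64)      # would be generated and exchanged in advance
ciphertext = otp_encrypt(b"meet at the relay station", pad)
print(otp_decrypt(ciphertext, pad))
```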
Assuming a 10 MHz "open" band and a very narrow bandwidth (and very slow transmission rate), many channels would be available in principle. This is a low level of practicality, but it must be noted that completely decentralised communication is inherently feasible. Massively redundant fibre-optic cable systems with decentralised routing are also technically feasible, and it is even feasible to consider significantly higher levels of design redundancy for undersea cables. While both feasible and practical for land-based systems, massive design redundancy appears to be feasible for undersea routes but not practical for the exposure level sought. Self-discovering radio-based mesh communications for land-based systems are feasible at present (Author 2007) and are likely to be more practical and economical than massively redundant fibre-optic systems for land-based communications. Hybrid approaches, e.g. using a self-discovering mesh for communications within a single land mass, and allowing access to a significant number of undersea cables, could meet the population levels and exposure levels to allow a claim of feasible decentralisation using current technology.

- Energy: The enormous research effort already devoted to energy storage, with limited progress, suggests that fundamental breakthroughs may not be likely. So-called "ultracapacitor" technology (De Rosa et al. 2017) may or may not overtake battery storage for smaller power levels in the future. The decentralised production of biofuel using macro-algae (Zev et al. 2017, Chen et al. 2017) is also immature, but the capability to treat sewage and create storable hydrocarbon fuel with high energy density is promising, and is perhaps the decentralised heating-energy storage mode that is closest to technical maturity at present.

- Information: The information storage requirements for a community could include significant medical information, and all data required for manufacture of complex components and synthesis of complex substances, in addition to contractual, financial, genealogical and similar data. Human-readable information storage (books, tablets) has very low exposure and high decentralisation currently. A decentralised and low-exposure approach for storage of machine-readable information does however pose a non-trivial technological challenge. Existing computer hard-disk drive storage technology is mature, but this approach has a very high technological exposure, since the format of storage and the recovery of data is via a complex and proprietary system, and the actual storage medium is not user-accessible. A criterion of E1 < 1 could in principle be achieved by massive redundancy, but in practice the use of identical operating systems and software makes it likely that residual exposure will remain. Technologies such as the 5D glass storage approach (Kazansky et al. 2016), proposals by the Longnow Foundation, or organisations such as Nanoarchival™ avoid the reliance upon proprietary data retrieval systems and provide very long life, but still lack a mature and decentralised data-writing approach, and a mature and decentralised approach for reading stored data back into a machine-readable form. The advances needed in order to achieve a durable, low-exposure storage medium are immature but are technologically achievable.

- Other: It is useful to note many other fields in which significant capabilities can be achieved with simple components: while far short of laboratory quality, the principles of spectrography and chromatography (Lichtenegger et al. 2017, Ghobadian et al.
2017) are actually accessible at a domestic level, as are microscopes (Prakash et al. 2017). Durable engineering has been demonstrated by spacecraft such as Voyager (Jones 1979) and the Mars explorers, and organisations such as the Longnow Foundation have considered the requirements for durable engineering, planning items such as a "10,000 year clock". In the context of this paper and earlier studies (Author 2010), we can consider that it is practical to design at least some classes of components and equipment for a usable timeframe that is demonstrably longer than the time required to re-develop a production facility to recreate them; this criterion is a valid test of low exposure practicality.

In summary: low exposure and decentralised options for a range of technologies have been examined. Some are mature and some have a low level of technological maturity. It is likely that some could mature rapidly (e.g. additive/subtractive manufacturing, self-discovering communications networks, information storage), while others such as energy storage have absorbed enormous R&D effort already with limited progress. This does not mean that a trend to decentralisation cannot begin, simply that a decentralised society may have more sophisticated capabilities in some fields than others. The ability to form fixed-duration, ad-hoc associations to enable some large-scale developments is not only possible for a society comprising decentralised groups, but is considered to be essential. This conclusion differs from the assertion by Jebari & Jebari (2015) that "…isolated, self-sufficient, and continuously manned underground refuges…" are appropriate, proposing instead intentionally independent population units who control their specific interactions with other units, achieving reduced vulnerability without forfeiting the capability to aggregate selected resources for mutual gain. Smaller-scale concepts such as crowdsourcing approaches are already demonstrating somewhat similar options for shorter-duration, ad-hoc design advances. Situations such as ships travelling in convoy are common and perhaps form a close analogy, demonstrating both the independence of each vessel and the possibility of cooperation. This paper has focussed on broad issues of technological vulnerability, but implementation details (resource usage efficiency, waste creation/treatment, gross greenhouse gas emissions and many other issues) are also acknowledged. Sociological/anthropological research (Dunbar 2010) has indicated that quite small populations provide a sufficient sociological group size for most, and this conclusion seems to remain broadly valid even where high levels of electronic connection are possible (MacCarron 2016).

The concept of a technological system's exposure provides both a tool and a metric, which can be applied to either individuals or to populations of individuals, and can supply useful data to the forecasting process. Population exposure is found to be high for many categories of current technological systems, and the current trend is towards increased levels of population exposure. The categories that have been considered as typical span and affect many goods and services that would be considered essential. Items as diverse as internet usage and financial transactions already have multi-national levels of population exposure (and hence high levels of centralisation), and although Diamond (2005) does not use the term "exposure" nor precisely describe the concept, he explains that environmental damage actually has a global exposure level.
Population exposure is a topic that does generate intuitive awareness of vulnerability: it can be observed that awareness of technological vulnerability exists at both national level (see J. Critical Infrastructure Protection, published by Elsevier) and at individual level (Slovic 2016, Martin 1996, Huddleston 2017, Kabel & Chmidling 2014, Reiderer 2018). Since a measure of exposure correlates closely with the effort required to protect a system, a continued trend to centralisation and increased population exposure is very likely to lead to progressively Herculean efforts to ensure that vulnerable loci are not attacked; such efforts are likely to include progressively wider surveillance to identify potential threats, and progressively stronger efforts to control and monitor access, each of which are themselves factors that can serve to make the service practically unavailable to individuals. If no value were attached to high and rising levels of population exposure, then highly centralised options are likely to continue to be preferred, but the consequences of the high and rising population exposure are demonstrated by reference to a metric of exposure, are also intuitively understood, and are illustrated by the "continued centralisation" option in Figure 3.

If there were indeed no technological options for reducing the level of population exposure without major reductions in the levels of technological sophistication available, then large populations are indeed exposed to the danger of losing access to services. That scenario would result in a catastrophic situation where a major reduction in sophistication of services occurs, but the remnant exposure is also reduced. That scenario has been labelled the "apocalyptic survivor" scenario in Figure 3. The third alternative shown in Figure 3 is progress towards a significantly decentralised society, with a comparatively low population exposure and a level of sophistication (in terms of technological services and goods available) that does not decrease and actually has the real possibility of advancing with time.

In this paper, the description of "forecastable options" has been considered under each of the categories of high population exposure, and it has been noted that for each of them technological developments that enable significantly reduced population exposure exist. The current levels of technological maturity of these options vary, but none are infeasible, and they complement the proposals advanced by a number of authors including Gillies (2015) and Blühdorn (2017).

In the course of considering decentralisation options, this paper has identified a small number of technological capabilities that would facilitate a more decentralised option, but which are currently at a low level of technological maturity. These include:

- Further advances in general purpose chemical synthesis
- A durable, open-source, machine-accessible information storage system that can be created and read in a decentralised context (requiring only minor development)
- A self-discovering network using an open-source approach to allow information transmission (requiring some development)
- A large-capacity, durable and economically accessible energy storage technology (requiring significant development)
- Further development of trustworthy and decentralised financial transaction systems (requiring some development)
- A non-technological system to allow ad-hoc cooperation between decentralised groups, allowing resource aggregation without long-term centralisation (requiring significant development)

Despite the inevitable variations in technological maturity across a broad range of technologies, the analyses have concluded firstly that centralisation and high population exposure result in severe and increasing vulnerabilities for large numbers of persons, and secondly that the combination of maturing decentralised technological capabilities and the storage of knowledge allows a transition to a "sophisticated decentralisation" model to be considered as a serious option. While not the primary topic of this paper, it is noted that even in the presence of technological options, change may not occur until some triggering event occurs; events that could trigger a more rapid transition to "sophisticated decentralisation" could include a truly major and long-term disruption of some highly centralised technology such as undersea cables, an irremediable malfunction of international financial systems, or a pandemic requiring high levels of population isolation.

The analysis of technological exposure and reduced exposure options has concluded that practical options for substantial decentralisation exist, or could reasonably be forecast as possible. It has also been proposed that there are substantive and immediate reasons to consider a qualitatively distinct "fork" from the high population exposure of current centralised society to a more decentralised model. Any selection of a decentralised technological model might be triggered by some event which crystallised the exposure of a highly centralised model. The drivers that had produced the centralised model will still remain, however, and will arguably tend to cause a re-centralisation without ongoing efforts. This issue is outside the scope of this paper, but is noted as a topic requiring further research. The capability for decentralised and machine-accessible storage of knowledge, and for the creation of complex components and substances, has recently created, or will soon create, a cusp at which local equipment is capable of reproducing itself. Sufficient computation capability at local scale is already able to make genuine advances and, with equipment capable of replication, is sufficient to allow fully sustainable, sophisticated and decentralised communities to diverge from current trends.

Declarations of interest: none

References
Consumer preferences for household-level battery energy storage
On Some Recent Definitions and Analysis Frameworks for Risk, Vulnerability, and Resilience
3D printing technique applied to rapid casting
Confronting future catastrophic threats to humanity
Dark Ages II: When the Digital Data Die
Eco-political hopes beyond sustainability. Global Discourse
Existential risks: analyzing human extinction scenarios and related hazards
Calling Dunbar's numbers
Colombian ancillary services and international connections: Current weaknesses and policy challenges
Digital Vellum
Macroalgae for biofuels production: Progress and perspectives
Undersea cables and the future of submarine competition
Addressing the Harm of Total Surveillance: A Reply to Professor Neil Richards
Mechanics of turn-milling operations
The anything factory
Sustainable Growth: an impossibility theorem
The Knowledge: How to Rebuild Civilization in the Aftermath of a Cataclysm
Black Code: Censorship, Surveillance, and the Militarisation of Cyberspace
Collapse: How Societies Choose to Fail or Succeed
Cascading blackout overall structure and some implications for sampling and mitigation
Molecular engineering: An approach to the development of general capabilities for molecular manipulation
How Many Friends Does One Person Need?: Dunbar's Number and Other Evolutionary Quirks
Intel's management engine is a security hazard, and users need a way to disable it
Hype and heavy tails: A closer look at data breaches
How to Hack a Turned Off Computer, or Running Unsigned Code in Intel Management Engine
The Protocols
Kinematic Self-Replicating Machines
An Innovative Homemade Instrument for the Determination of Doxylamine Succinate Based on the Electrochemiluminescence of Ru(bpy)2+3
Paul Mason's PostCapitalism
Emerging technologies, catastrophic risks, and ethics: three strategies for reducing risk
A critical review of cascading failure analysis and modelling of power system
Near-Net Shape Manufacturing of Miniature Spur Gears by Wire Spark Erosion Machining. Materials Forming, Machining and Tribology
Leontief based model of risk in complex interconnected infrastructures
Weaving the Web: Otlet's Visualizations of a Global Information Society and His Concept of a Universal Civilization
Long-Lost Satellite Tech Is So Old NASA Can't Read It. Tech & Science. NASA IMAGE. Observer
NASA Satellite Lost for 13 Years Recovered by Amateur Astronomer
When Do States Disconnect Their Digital Networks? Regime Responses to the Political Uses of Social Media. The Communication Review
Conceptual design and dimensional synthesis for a 3-DOF module of the TriVariant, a novel 5-DOF reconfigurable hybrid robot
"Prepper" as Resilient Citizen. Responses to Disasters and Climate Change
Daraprim drug's key ingredient recreated by high school students in Sydney
Industrial automation systems and integration -- Physical device control -- Data model for computerized numerical controllers
Automation systems and integration -- Product data representation and exchange (known informally as "STEP")
Existential Risks: Exploring a Robust Risk Reduction Strategy
Automatic Fault Protection in the Voyager Spacecraft
Disaster Prepper: Health, Identity, and American Survivalist Culture
Eternal 5D data storage via ultrafast-laser writing in glass
NASA Tests Limits of 3-D Printing with Powerful Rocket Engine Check. NASA engineers prepare to hot-fire test a 3-D printed rocket part on Test Stand 115 at the Marshall Center
Recent progress on cascading failures and recovery in interdependent networks
Visible light spectral domain optical coherence microscopy system for ex vivo imaging
Longnow foundation
Total Surveillance, Big Data, and Predictive Crime Technology: Privacy's Perfect Storm
Is Your Algorithm Dangerous?
Solar pyrometallurgy: An historic review
Electronic Payment Systems. Artech House
Organic synthesis: The robo-chemist. The race is on to build a machine that can synthesize any organic compound. It could transform chemistry
Permanent Archival solution: NanoArchival provides ultra-long term archival solutions to enterprise customers, based on its proprietary technology
A Dangerous Trend of Cybercrime: Ransomware Growing Challenge
Population dynamics with and without selection
Optical microscope
3D printing based on imaging data: review of medical applications
Doomsday Goes Mainstream
High performance spiro ammonium electrolyte for Electric Double Layer Capacitors
Emerging Threats: Outer Space, Cyberspace, and Undersea Cables
Von Neumann machine. Encyclopedia of Computer Science. 4th ed. Pages 1841-1842
Monsters in cyberspace: cyberphobia and cultural panic in the information age. Information
Applied Cryptography, Second Edition: Protocols, Algorithms, and Source Code in C, 20th Anniversary Edition. Online ISBN: 9781119183471
A new rationale for returning to the Moon? Protecting civilization with a sanctuary
A thesis submitted to the Faculty of Graduate Studies and Research, McGill University, in partial fulfilment of the requirements of the degree
The Perception of Risk
Energy and Civilization: A History
Hulking transformers prove a challenge for vulnerable grid. E&E 2015-11-24, and see Energy News Digest
Blockchain: Blueprint for a New Economy
Censorship and Surveillance in the Digital Age: The Technological Challenges for Academics
The Designer's Guide to Disk Drives
The potential impact of quantum computers on society. Ethics and Information Technology
High-yield bio-oil production from macroalgae (Saccharina japonica) in supercritical ethanol and its combustion behaviour