key: cord-0035564-zt6q2h1c
authors: Davies, Wayne K.D.
title: Healthy Cities: Old and New Solutions
date: 2015-03-24
journal: Theme Cities: Solutions for Urban Problems
DOI: 10.1007/978-94-017-9655-2_13
sha: c183924cc1d4de9e814b72461d2ef6665dc968a7
doc_id: 35564
cord_uid: zt6q2h1c

Cities have historically had higher mortality and morbidity rates than rural areas. These spiked dramatically after the Industrial Revolution, which led to the first health of cities movement that stimulated the adoption of new public health measures to improve the built environment in the developed countries. Together with such additional factors as increasing prosperity, hygiene and especially medical advances, the old health disadvantages of cities were reversed. But a new set of medical challenges threatening to reverse previous progress has emerged. These include such problems as bacterial resistance to many of the drugs that reduced communicable diseases, as well as the effects of indoor living and aging, all of which require urgent attention. In addition, a review of the various health determinants that contribute to ill-health shows that since many of these factors are not within the purview of current medical practice, they must be addressed if the health and well-being of people in cities are to be improved. A series of other problems that were previously overlooked are being tackled by the new Healthy Cities movement, such as the need for more political and citizen involvement in planning and delivering health care, better ways of promoting health rather than just curing ill-health, with a new emphasis on wellness, as well as more effective measures to reduce the persistent pockets of ill-health in many cities.

Throughout most of history towns and cities had higher death rates than rural areas, creating what amounted to an urban health disadvantage. This was largely because of their densely packed and insanitary conditions, which made their residents especially prone to diseases spread by contagion and contaminated water. So the very creation of these unhealthy man-made environments was responsible for much of the increase in morbidity and mortality. Of course there were times when some of these problems were reduced. For example, from the mid-third millennium BCE some of the buildings in major cities in the Indus valley, such as Mohenjo-daro and Harappa, had installed drainage systems and even running water to remove human waste in rooms we would call toilets (Possehl 2002). In subsequent centuries civilizations such as the Minoans in Crete built underground clay pipes for water supply and sewage removal. But it was Roman engineering in the centuries before the common era that endowed towns and cities in their empire with the most sophisticated water systems to date, bringing clean water from long distances to their settlements, adding water fountains and public toilets near town squares, and encouraging cleanliness through the many public baths, although their use of easily worked lead for some of the water pipes did produce some poisoning. Moreover, Vitruvius's classical book on architecture (Morgan 1960, p. 21), probably written during the time of Augustus, included advice to town builders to make sure their settlements were not sited near marshy ground, which was known to be prone to fevers and ague, probably what we now know as malaria. These technical advances in various early civilizations were often forgotten or ignored by subsequent urban builders.
The result was especially disastrous in the rapidly expanding, but unsanitary and overcrowded, large towns created by the Industrial Revolution, where mortality and premature death reached such critical levels that political action was finally taken to solve the problem, primarily through sanitary, water supply and building regulatory policies. These policies produced the first major changes to improve the health of urban residents over the last two centuries, at least in the countries of what became the developed world, and should be known as the first healthy city movement.

The second major change in the health of people in urban places developed from the late nineteenth century onwards. Impressive reductions in mortality and disabilities occurred in part through reductions in family size, better hygiene and nutrition and healthier physical environments because of the public health interventions, but also through unprecedented increases in the medical profession's ability to cure people of many diseases and impaired bodily functions. Together with improvements in prosperity, these changes removed the historic urban disadvantage in health, so that urban areas in the developed world came to have greater longevity than rural areas.

By the end of the twentieth century doubts began to be cast on the ability of existing practices alone to further improve the health of the population, especially in urban places. Three new sets of challenges can be recognized. First, new medical difficulties have arisen that threaten to turn back the gains of the past centuries, such as disease-causing bacteria that are resistant to drugs, and the increase in new chronic diseases. They pose problems not only in poor developing countries, where they are compounded by inadequate sanitary and clean water facilities and poor building structures, but also in the developed world as the population ages and needs more care. Second, there is a revived emphasis upon the so-called 'determinants of health', the factors that influence health and ill-health, many of which are environmental in nature, affect health over the long term, and are not always effectively dealt with by individual medical care procedures. For example, one of the biggest contemporary problems is air pollution, especially from fossil fuel burning in urban areas and in households with open fires. A recent World Health Organization report (WHO 2014a) estimates that over 7 million deaths a year occur from this cause, not directly, but through premature deaths from heart disease and respiratory tract cancers. The long term effect of this and other factors means there is a need to focus as much upon the reduction of these problems and the promotion of health as upon treatments of diseases or decaying body functions. Third, various agencies, from the World Health Organization's Healthy City Programme that began in 1986, to local governments and community groups, have been advocating new ways of improving health in cities. These amount to new organizational changes and the targeting of particular problems, and have a strong urban basis, for this is where increasing numbers of people live. Like so many of the new urban themes that have been advocated in recent years, the approach stresses the need for a commitment to look beyond current practice in health care, as well as the need for political, community and wider stakeholder involvement, as this WHO statement shows.
Being a Healthy City depends not on current health infrastructure, rather upon a commitment to improve a city's environs and a willingness to forge the necessary connections in political, economic, and social arenas…. It aims: to create a health-supportive environment; to achieve a good quality of life; to provide basic sanitation and hygiene needs; to supply access to health care. (WHO 1998)

The emphasis upon factors other than the existing health care system requires a more holistic view of the various determinants that affect the health of the population in urban areas. In many ways it represents a new emphasis upon a different type of public health approach, one which had gradually declined from the 1880s after the success of the sanitary approach in improving the physical environment. This has strong links with emerging ecological ideas (Ashton and Ubido 1991). This chapter begins the review of all these changes by summarizing these new attitudes to health, not just ill-health, before reviewing the nineteenth century urban improvements that helped reverse the high mortality rates. This is included not only for historical reasons but because it provides an exemplar of policies that are still needed in many cities in developing countries today if the health of their urban residents is to be improved. The historical review is followed by a summary of the very serious new challenges to medical care, and then by descriptions of the policies that can be used to improve the various health determinants and of the initiatives of the Healthy City movement. Although many of the policies may be made and financed by national governments, there is not only an important urban component in the delivery of the services, but also the increasing realization that life in an urban habitat may be creating new health problems.

Health is not an easy concept to define. Superficially it seems to relate to the idea of a population that is disease-free. However, the constitution of the World Health Organization has described health in a wider way: "Health is a state of complete physical, mental and social well-being, and not merely the absence of illness" (WHO 1948). The trouble with this definition is that the use of the word 'well-being' covers much of the human condition and does not lend itself to easy measurement. Nevertheless the phrases 'well-being' and 'absence of illness' do mean that a wide view of health issues is envisaged. Later this led to a more wide-ranging statement, concerned with health promotion as much as with curing illness, from the European office of the WHO in its Ottawa Charter for Health Promotion in 1986.

Health promotion is the process of enabling people to increase control over, and to improve, their health…. To reach a state of complete physical, mental and social wellbeing, an individual or group must be able to identify and to realize aspirations, to satisfy needs, and to change or cope with the environment. Health is a positive concept emphasizing social and personal resources, as well as physical capacities. Therefore, health promotion is not just the responsibility of the health sector, but goes beyond healthy lifestyles to wellbeing. (WHO-E 1986, p. 1)

This description emphasizes that the health sector cannot solve health problems alone. It has led to the popularization of the concept of 'wellness', which is being used to describe a more holistic approach to the improvement of health, emphasizing health promotion as much as treatment.
Health is not simply the absence of illness and disease, but something we build with our families, schools, communities and workplaces, in our parks and playgrounds, the places we live, the air we breathe, the water we drink and the choices we make. (Alberta Health 2014, p. 4)

So there is a new emphasis upon prevention, of encouraging healthy living, not just treating our way out of ill-health through existing and future medical knowledge. Yet healthier living is not the only benefit; the work of researchers discussed in the Knowledge City chapter (Chap. 11) has shown the link between I.Q. growth and disease reduction, which means a healthier population is likely to be a more productive and innovative one, assisting future economic and social progress (Eppig et al. 2010). These new perspectives have led to the need to pay more attention to the many factors, or health determinants, that affect health in both the long and the short term. They lead to interventions designed to create healthy cities that are not limited to the current health care system.

It was well known throughout history that urban places were less healthy than rural areas, and from the seventeenth century some individuals provided quantitative evidence of these differences. For example, John Graunt (1662) in England showed that 1 in 30 died annually in London, whereas the rates in the country were 1 in 50, with a third of deaths from infant mortality. Plagues were particularly feared. Although always present, there were extreme epidemic plague years, usually at 10 or 20 year intervals in London, but with even more severe outbreaks, such as those in 1563 and 1625 when deaths increased five to six times above normal rates (Harding 2012, p. 31). Limited medical knowledge meant that there was no clear understanding of the causes of the high mortality, but most attributed it to miasma, the effects of foul air and filth in these densely populated, unsanitary urban areas. Within cities there were important variations in mortality rates, as demonstrated by Villermé's pioneering study of Paris in 1817, which showed how wealthy areas, such as the second and third arrondissements, had annual mortality rates of 1 in 62 and 1 in 60 respectively, whereas the two poorest areas, arrondissements 11 and 8, had higher rates of 1 in 43 (La Berge 1992). In recent years these historical spatial variations within cities have been investigated more thoroughly. For example, it has been demonstrated that the richer areas in the centre of seventeenth and eighteenth century London, with their more substantial houses and a better fed and clothed population, had far lower mortalities than the poorer districts. Coal fires warmed rooms, but led to indoor pollution. Outside the buildings the emissions of gases and particles from coal fires often created choking fogs, full of particulates, in damp, river valley environments subject to inversions, adding to the respiratory risks. This was related to a sixth problem, the often dangerous working conditions in mines and industrial plants, or even in home-based industries, which led to deaths or injuries from machinery or various industrial processes, especially those that gave off noxious fumes or by-products. A seventh problem came from the increase and concentration of poor people in the cities. Their poverty was due to limited wages and often intermittent work, which left most with limited means to buy nutritious food, warm clothing or adequate shelter. Combined with poor hygiene it made them more prone to disease.
An eighth problem came from the limited medical knowledge and access to it at the time. This was made worse by the reluctance of many to embrace new ideas, while the poor had too little money to even get medical assistance, unless it was from charity provided by church organizations. The effect of these problems on the lower class in particular was calamitous, as seen in a description by Engels of the labouring classes who were:

… For the most part, weak, thin and pale. Their weakened bodies are in no condition to withstand illness and whatever infection is abroad, they fall victims to it. Consequently they age prematurely and die young. (Engels 1844, pp. 118-119)

These descriptions by Engels and the more substantial works 50 years later by Charles Booth (Pfautz 1967; Davies 1978), who produced detailed maps of the degree of poverty in each street in London, measurements of social conditions in its major districts (Davies 1978), and seventeen volumes on the conditions of life and workplaces in the metropolis, identify many of what we now call the social determinants of health. Only a few in these slum areas were able to escape the environment within which their lives were lived, a milieu of urban ill-health and premature mortality, although conditions were better in the more prosperous areas of well-built and maintained houses. Hence, the built and social environment in which so many of the disadvantaged lived lay at the root of their ill-health and early mortality.

The work of Chadwick, the descriptions of novelists and social commentators, as well as doctors such as Snow (Frerichs n.d.), in exposing these problems eventually led to increasing political debate about the need to improve these environments (Berridge and Gorsky 2012). Some of the pressure for change came from altruism, the view that we are all humans and that the poor should be helped, a particular belief among Christian religious organizations. Also important in supporting built-environment changes was the realization by many in the political classes that an unfit, weakened population needed to be improved if the country was to be able to assert and defend itself with healthy, effective armed men. These reasons, plus what amounted to panic during the high spikes of mortality, especially in the cholera outbreaks, also meant the better-off felt threatened. It is also important to emphasize the role played by associations such as the Health of Towns Association, established in Britain in 1844. Branches were quickly established by business and social elites in most major centres. Their members organized public meetings to present facts on mortality variations and exerted great pressure on governments to create laws that led to the development of sanitary and clean water systems in towns, as well as Public Health officials to monitor and often agitate for change (Ashton and Ubido 1991). They helped promote the idea that there must be improvements in the physical fabric of towns and their infrastructures. Yet there were still those who believed the fault for the conditions in these areas of squalor lay with the poor themselves, either because people were too lazy or immoral to improve their condition, or because they were simply inferior and prone to disease.

Four main sets of changes were crucial in transforming the urban health conditions of many settlements. The main one consisted of improvements in the physical fabric of cities, helped by a series of major public health acts and new technologies.
For example, in Britain, the 1848 and 1875 Public Health Acts laid down enforceable building standards, not simply in construction but in minimum sizes of houses and layout, plus tarmac roads, as well as the requirement for small gardens and at least outside toilet facilities (Hall 1988). Critical in these changes were the additional requirements for a clean water supply and new sewer lines to connect the housing areas to plants to disperse and process the sewage, although initially many towns simply dumped it into the nearest water body, from which many residents obtained their water. Increasingly it was the responsibility of the municipal government to provide such facilities, in addition to fire protection, garbage collection services, and the banning of animal husbandry in urban areas to reduce its waste. This led to greater powers for local governments in providing these services, replacing previously inadequate and sometimes private provision, as well as the important addition of Medical Officers of Health to monitor health conditions in local areas. Although often known as the 'sanitation phase', it may be better summarized as a regulatory and engineering phase of improvement. It was the engineering advances in supplying water from often distant reservoirs, and then filtering it, first through sand and later disinfecting it with chlorine to kill microbes, that led to clean urban water supplies, later helped by the invention of engines that allowed water to be pressurized for distribution to residences. Another major change came with the development of toilets, using the new water supplies, in which the flushed water containing human waste was carried away by new sewer systems that also removed other waste water. Later, the addition of plants to process the sewage effluent improved the systems, replacing the earlier approach of simply dumping it into the nearest water body, although far too many urban places still do so. Also the new regulations included effective inspection systems that controlled and improved building quality and industrial plants, and eventually, the safe disposal of toxic by-products. Overcrowding was gradually reduced by decisions to eradicate older slum areas and build at lower densities, while the development of unions gave workers more ability to argue for workplace safety, as well as better wages. The creation of a better physical urban environment helped reduce the spread of water- and airborne communicable diseases.

A second set of factors, involving broad social changes, was also influential in improving health. Lower fertility rates decreased family size and reduced child mortality. Increases in prosperity led not only to better nutrition from more food supplies, but also to better clothing and housing. Greater cleanliness and knowledge of hygiene were also important. In addition there was a gradual and crucial improvement of medical knowledge and nursing capabilities.

A third trend came from greater acceptance of a new valuation of nature and appreciation of the utility of green space that began with the Romantic revolution in art. This stimulated the creation of public parks for leisure and escape from the noise and filth of the city, and cemeteries for the safe disposal of dead bodies (Reps 1965). These ideas later led to the addition of green space in the planning of new subdivisions.
Restricted at first to wealthy residents, these planning ideas gradually filtered down to middle class developments and to model towns for workers (Davies and Herbert 1993), and eventually to the Garden City movement. The latter led to a more comprehensive approach, advocating complete towns combining the best of town and country, ideas that influenced subsequent suburban designs, as discussed in Chaps. 2 and 4 especially. New ideas about the restorative values of nature also led to the belief that it could contribute to better health, which in turn was assumed to develop an improved morality, unlike the social degeneration seen in cities and especially slum areas, which were thought by many to breed moral as well as physical diseases. In Britain it led to many new movements designed to encourage a more active and outdoor life-style, from the addition of sports and physical exercise in schools, to the Scout and Guide movements from 1908, with their outdoor orientation and camping trips, to the 1926 New Health Society, which sought to improve nutrition, as well as to the Sunlight League of 1926, which extolled the health-giving properties of sunlight (Carter 2012). Similar movements occurred in many countries, especially in Germany, where there was greater emphasis on the need for hygiene to be taught in schools and to be part of everyday life, stemming from Dr Weyl's 10-volume treatise on hygiene that began in 1893 (Schott 2012). Although many of these social movements of the early twentieth century faded during the war period and its austere aftermath, the revival of environmental movements from the 1960s and the more recent ideas of the nature deficit in children discussed in Chap. 4 can, in part, be regarded as inheritors of these earlier traditions.

A fourth transforming trend was less a matter of policy or organizations than of fashion and life-style, namely the benefits of spa and seaside towns. These were regarded as being sited in healthier environments, allowing the opportunity to improve health by sojourns in these centres. The earliest examples came from the expansion of spa towns from the seventeenth century, places with mineral springs which were assumed to have therapeutic properties, obtained either by immersion in, or by drinking, such waters (Adams 2012). Many of the early European spa towns were revivals of settlements on old Roman mineral water sites and were primarily patronized by the elite, which meant they became as much social as medical centres. Up to the 1930s they were patronized primarily by the elite and their therapeutic values were highly regarded. But in the U.K. many declined after the creation of the National Health Service in 1948 because this nationalized system was not prepared, except in a few cases, to subsidize patients to attend what were expensive courses of treatment. Although there has been a recent revival of some spas in Britain and the development of many new private ones, it has been concluded that they are based primarily upon methods of treatment, rather than the particular place properties of climate or water (Adams 2012). However, in continental Europe there has been much greater continuity in the use of many of the old spas, as health providers were more likely to prescribe such courses of treatment for all classes. The bracing qualities of sea air were also recognized and promoted from the late eighteenth century, providing a healthy alternative to the stink, noise and fevers found in cities.
Again it was the upper classes who led the initial growth of these centres. With increasing affluence by the end of the nineteenth century, the middle classes, and eventually the working classes after the introduction of paid holiday time, also found temporary solace in these seaside places. Bodily immersion in the sea, exercise by swimming, and the apparently beneficial effects of tanning were added to the initial atmospheric benefits, although the fear of skin cancer from over-exposure to the sun has reduced the time people spend on tanning in the last 20 years.

By the end of the nineteenth century the changes described above began the health transformation of many urban places in developed countries that led to a rapid decline in premature deaths. Soon after, a new phase of improved medical knowledge, training, surgical techniques, the use of new technologies, and the discovery of antibacterial drugs and vaccinations for many diseases, increasingly played a major part in reducing deaths. Although advances in sanitation and environmental improvement were still being made, what became known as the biomedical model became dominant in health care.

This model attributes morbidity and mortality to molecular level pathogens brought about by individual life-styles, hereditary biology, or genetics, and it altered public health to personal 'risk' factors such as smoking, diet and exercise. (Corburn 2009, p. 49)

Although local government continued the progress of ensuring improvements in the physical fabric, and Public Health officials monitored local health conditions, the growth and dominance of what amounted to germ theory ideas and the success of the individual cure-and-care approach of doctors and hospitals meant it was the medical profession, not city officials as in the engineering and regulation phase, that increasingly became the main decision-maker in health care. In addition, there was a move away from what is usually called the 'urban field view' in health terms, where residents and professionals searched for the particular qualities of place, created by the interactions between its various elements, that caused ill-health. This was seen in the early sanitary phase, which led to more context-specific and localized policy responses. Although the move to laboratory sites and techniques produced great results, the reduced interest in real-world, rather than controlled, conditions meant that the local milieus that caused or assisted diseases were often underestimated. In addition, the new profession of planning increasingly downplayed the need to put health issues at the forefront of its concerns. Certainly many planning departments emphasized the need for a more ordered and efficient land use distribution, especially to ensure that noxious industry would be separated from residential areas, while ensuring the latter would have various effective public facilities, from roads and sewers to schools and recreational areas. These issues were connected to health improvements. But Corburn (2009, p. 41) has described how attempts were made by some early twentieth century American planners (Marsh 1909) to argue that the planning profession should focus primarily on the key issues of social justice and the health of cities that had motivated the nineteenth century reformers. Such opinions failed to influence leaders of the new American Planning Association. So efficiency, and also aesthetics such as the City Beautiful ideas, became the main emphases.
Yet planners often assumed that efficient land use planning was itself a way of improving health, especially after acceptance of the Neighbourhood Unit principle, with its insistence on lots of green space, the incorporation of many facilities within the unit, and the positioning of main roads outside the unit. In addition, the removal of slum areas and their replacement with new tower blocks was also seen as a way of improving the housing conditions of the poorest classes and hence their health. However, there is more than a whiff of social determinism in such planning, since it was assumed that the provision of new housing in these low density layouts and new high-rises would automatically alter behaviours and improve health. Advocates did not anticipate that the neighbourhood units would become car-suburbs as vehicle ownership increased, contributing to the lack of exercise, or that many of the high-rise estates would become crime-ridden areas with few jobs, limited security and social isolation, creating new areas of deprivation, ill-health and often racial segregation. Hence in health terms a new set of problems emerged in these newly developed urban environments, while disadvantaged areas persisted. These were not effectively addressed by municipal planners. Also the increasing domination of the biomedical model of medicine meant that, despite the sterling work of community health specialists, most attention was focused on the cure of the diseases of individuals, not on the wider and often long term issues that contributed to these problems, or on the recognition that decision-making on health care issues should involve more than medical professionals.

At first sight the impressive gains in medical knowledge, care, and disease prevention, and their constant improvements over the past century, would imply that the health of people in urban areas would continue to improve. However new problems threaten to reduce the life-span of people in coming generations compared to their parents, a potential general health reversal unknown in the last century, except in exceptional circumstances such as war. Among the varied problems, eight in particular demand solutions through new medical advances and care, and some of these may be influenced by increasing urbanization trends. Table 13.1 shows that the WHO estimates of the top twenty direct causes of mortality in 2015 and 2030 are no longer dominated by the communicable diseases that so devastated the populations of cities in the past; most seem to have been conquered. Today the majority of deaths come from so-called chronic diseases, linked to organ failures, such as heart attacks and strokes, to cancers of various types, and to diabetes. These diseases now account for 35 million deaths a year out of a total mortality of 57 million, which has led to calls for urgent world attention to be paid to these problems (WHO 2010). Yet despite the major advances that have taken place in the treatment of such diseases, especially if the problem is diagnosed early, Table 13.1 shows they are still predicted to be the highest ranking sources of mortality in the foreseeable future. Moreover, these chronic diseases are no longer restricted to the developed world, for 80 % of the deaths from such causes now take place in low or medium income countries, places that are the least equipped to deal with such problems.
Clearly there is a need for more research into cures for these problems, some of which stem from long exposure to unhealthy environments or poor life-styles over many years, which means better health education and monitoring are also required. More treatment of such diseases in the developing world is needed too. Indeed it has been estimated that less than 3 % of the international development assistance going to developing countries is spent on non-communicable diseases (WHO 2010), showing the continued fixation on diseases spread by infection. A related issue is the increased cost of providing care for patients with such chronic conditions, which require longer periods of care and greater expense.

These changes in the causes of mortality are also related to the general problem of coping with aging, for it is estimated that the global population of those over 65 years will almost double from the current 600 million to 1.1 billion by 2035, resulting in a change from 8 to 13 % of the population. Some of these people will be healthy and active, but many will have ill-health, and major increases are predicted in the numbers of people suffering from various forms of dementia, such as Alzheimer's, for which there is no known cure at present, only interventions that help to delay or reduce its impacts. Recent estimates show that 35.6 million people in the world are already suffering from this disease of cognitive impairment and loss, conditions that already lead to health care costs of $ 604 billion (WHO-ADI 2012). The numbers with dementia are predicted to almost double by 2030 and will more than triple to 115.4 million sufferers by 2050. This will require much larger numbers of senior centres and homes to cope with, and treat, the growing number of patients, as well as enough qualified staff. It has long been known that many life-style factors are correlated with the disease; for example, people who are active and socially integrated, and those with higher educational attainments, are less likely to be at risk. In 2014 a major new study focused on the impact of a series of modifiable life-style factors on the incidence of Alzheimer's, a major type of dementia (Norton et al. 2014). It revealed that seven factors-physical inactivity, smoking, mid-life hypertension, mid-life obesity, diabetes, depression and low educational attainment-were all significantly correlated with the disease and that one-third of cases could be avoided by changes in life-style. For example, those who did not engage in at least three 20 min bursts of rigorous exercise a week were 82 % more likely to develop the disease. The importance of the study is that the estimated huge increase in numbers could be drastically reduced by these life-style changes. The problem of coping with this wave of new cases will be hardest in the developing world, where there are fewer resources to deal with the problem, especially in cities where there are likely to be fewer family members or long-term friends to help with care.

Although the increase in the elderly population is a growing problem in most developed and middle income countries, there are marked spatial inequalities in this effect, with aging in Japan occurring more rapidly than in other developed countries. It is creating one of the most severe current problems there, but also provides a warning to other countries.
The population over 65 years of age in Japan reached 23 % of the total in 2011 and is projected to be 38 % by 2050, with the total population declining by a third from the current 127 million within 50 years. It is estimated that by 2035 there will be 69 persons over 65 years of age in Japan for every 100 in the working-age population (25-64 years), up from 43 in 2011, whereas the equivalent figure for the world today is only 16. Some of the effects of the aging trend may be reduced by raising fertility rates and by people working longer. But these skewed future rates will not only require more health care but will also lead to potential GDP losses unless labour is replaced by capital and more immigration is allowed, which, in the case of Japan, has never been a policy palatable to the majority. Since caring for this aging population is now a major problem, and many villages and small towns will soon be dominated by old people, Japan instituted a long-term care insurance scheme for the elderly, funded by compulsory contributions from people over 40 years of age and by other taxes. Few other countries or cities facing aging populations are making such provision. Indeed in some countries, such as the U.K., government cutbacks since the financial crisis that began in 2007 have led to major cuts in the budgets of local authorities, and care for the aged is falling at a time of greater need.

However in some cities there is a gradual realization of the need to plan for this aging process. For example, in many American cities, such as New York and Cleveland, there are now Departments of Aging in the municipal organization, which are designed to prevent elder abuse and also to provide programmes to assist the elderly, especially by creating websites that list the various services that seniors can access to obtain help. A more general approach to mitigating the effects of aging came with the WHO's establishment of the Global Network of Age-Friendly Cities and Communities (GNAFCC) in 2006. Initially based on 33 cities in 22 countries, the network is designed to provide guidance to other cities wishing to implement policies that promote healthy and active aging by creating an active and accessible urban environment. Key components are the involvement of seniors in the discussions and policy formulation, and the provision of indicators to measure progress, as well as evaluations of effectiveness in subsequent years. Eight domains of city life have been identified in a guide for age-friendly cities, as well as communities, within urban places (WHO 2007). These domains are in the fields of: transportation; respect and social inclusion; outdoor spaces and buildings; social participation; housing; information and communication; community support and health services; civic participation and employment opportunities. Historically many churches, and neighbourhood networks based on long residence, as well as families, have provided fellowship and help for seniors. But in a more mobile and secular society these local support systems have decayed, making it imperative to plan for an active older population, not simply to house it.
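The dependency figures quoted above for Japan can be read as a simple old-age support ratio. The notation below is only an illustrative formalization of the measure as the text describes it (persons aged 65 and over per 100 persons of working age, 25-64 years), not a formula taken from the cited sources:

\[
R \;=\; \frac{P_{65+}}{P_{25\text{--}64}} \times 100, \qquad
R_{\mathrm{Japan},\,2011} = 43, \quad
R_{\mathrm{Japan},\,2035} \approx 69, \quad
R_{\mathrm{World,\ today}} \approx 16 .
\]

On this reading, Japan would move from roughly 2.3 working-age persons per older person in 2011 to fewer than 1.5 by 2035, which underlines why labour replacement, longer working lives and dedicated long-term care funding feature so prominently in the policy responses described here.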
It is tragic that so many old people's homes are built in remote locations in cities, with few places to walk to or to obtain goods, resulting in what amounts to the warehousing and segregation of old people, rather than building premises that support active living near shopping or transit nodes and adopting the other recommendations of the WHO's age-friendly city network (GNAFCC), issues largely ignored by New Urbanism and other themes.

One of the biggest challenges to modern medicine is to create a more equitable pattern of 'cure and care'. The poorest people in developed countries and the majority in the developing world still have limited access to health care facilities, even at a basic level, with resulting high and unnecessary death rates from diseases for which cures are available. One stark example can be seen in Table 13.1, which shows the persistence of high death rates from the old scourges of childbirth, such as preterm birth problems (11) and birth trauma (18). A large proportion of these 1.9 million deaths occur in rural areas and could be prevented by access to modern medical knowledge and better health care, which should come with greater urbanization, one of the reasons why both causes are expected to drop substantially in rank over the next 15 years.

Much of the reduction in mortality and ill-health rates in urban places has been due to the way that the most widespread communicable disease threats, such as measles, chicken pox, polio or tetanus, have been controlled by vaccinations that provide adaptive immunity against the pathogens that cause these diseases, and by antibiotic drugs that kill bacterial infections. So only respiratory infections (no. 3 in Table 13.1), diarrhoeal diseases (no. 5) and tuberculosis (no. 13) are found in the top twenty major causes of death today (WHO 2012a). In total only a fifth of the annual deaths in the world now come from infectious diseases. Yet the proportions vary spatially, from levels of 40 % in the less developed world to only around 7 % in developed countries such as the U.K. (Davies et al. 2013, p. 27). The limited availability of effective health care, antiseptic methods, vaccinations and antibiotics in the developing world, combined with limited clean water and sanitary facilities, and more airborne pollutants, means that these treatable diseases still kill far too many people, given existing medical knowledge. So there is still a lot of room for real progress in the fight against such diseases in the settlements of poorer countries. Moreover, the current control over most infectious diseases should not lead to complacency. Some, such as smallpox (Kopolow 2003), have been eradicated, with polio well on the way until new outbreaks occurred in Afghanistan, Pakistan and Northern Nigeria, a result of war and fanatical Islamists preventing vaccinations. Others, such as the increasing cases of tuberculosis in cities such as London, seem to come from immigrants arriving with the infection from countries where the disease is still endemic. This means that more effective screening of the health background of immigrants is probably needed. Also, there is always the threat of new outbreaks of many of the diseases now largely controlled, because increasing numbers of people are refusing vaccinations. So vigilance to spot new outbreaks and their sources, as well as mass vaccination programmes, is still needed.
There is also the threat of new, previously unknown diseases appearing, with the cases of the acquired immunodeficiency syndrome (AIDS) and Severe Acute Respiratory Syndrome (SARS) providing opposite examples of early failure and success in coping with such problems. In the former case Holt (2014) has described the growth of the disease, caused by a virus spread through sexual contact and contaminated blood, which mutates quickly and was difficult to treat. It led to a peak of 2.2 million annual deaths. Originally unrecognized, having originated in infected monkeys in Africa, its spread from the 1980s was helped by governments, such as that of South Africa, which refused basic treatment to victims. Today antiretroviral drugs can halt its progression, allowing people to lead an active life, although they are not cured. So, since 35.3 million people are infected worldwide, the virus is still estimated to be the sixth most common cause of death in 2030 (Table 13.1) and remains a major threat.

By contrast, the respiratory disease SARS, which spread between November 2002 and July 2003, provides an example of the dangers that can quickly emerge in our connected world (Enserink 2013). It originated in a species jump of the virus from its hosts in bats and civets to a Chinese farmer, and then spread to others in local markets and Hong Kong. In this case it was relatively quickly diagnosed when travellers from Hong Kong brought it to North America, owing to the vigilance of doctors in major hospitals in Canada and the United States, who had been alerted to a new disease in southern China, and to its rapid identification by Canadian researchers and the U.S. Centers for Disease Control and Prevention. Victims were isolated in special wards, with their contacts being tracked and warned (Enserink 2013). In addition, civets in the southern Chinese markets that acted as hosts were destroyed. The use of masks and other protective clothing by carers, and the careful cleaning and sanitation of all surfaces near infected people, also reduced the spread of the infections. In the longer term, the search for cures is vital. What the example also shows is the need to have large numbers of wards or places available to be turned into isolation rooms if a similar major outbreak occurs-a scale of resources which may still be rare in most urban centres, even in the developed world. In the past, separate quarantine hospitals, and even areas, were set aside in or near cities for people with infectious diseases. For example, North Brother Island in New York fulfilled this role, especially for immigrants, although it is now abandoned and its buildings are decaying. If infectious diseases cannot be controlled by vaccines and drugs this type of separation may again have to be practiced. The history of the SARS outbreak shows the need for rapid genome typing of a suspected new disease and the ability to quickly create large numbers of isolation wards to quarantine victims. It illustrates how a multipronged and international effort is needed to prevent the spread of new communicable diseases. Yet there is little doubt that there are many diseases that are still unknown, or that, like the haemorrhagic fever Ebola, kill 70 % or more of infected people and have no cure. Until recently Ebola had not spread beyond a few isolated areas of Africa and seemed to be containable. But in 2014 a major outbreak occurred in three countries of West Africa, which has raised the spectre of a much wider spread.
The response to the outbreak has been helped by skilled international medical teams, but the burial practices of many cultures in the area, plus suspicion of western medical practices and the desire to avoid the stigma of having a family member diagnosed with the disease, have handicapped containment, showing how cultural factors and very limited medical facilities impede medical progress. With the exception of HIV-AIDS, the world has managed to avoid pandemics for almost a hundred years since the so-called Spanish flu, a virus causing a virulent form of influenza that killed at least 50 million people and infected 500 million worldwide in 1918-1920. But new travel contacts and the increasing concentrations of people in urban places put this generally positive twentieth century history of controlling communicable diseases at greater risk, and this is linked to another emerging problem.

'The drugs don't work' is the title of a provocative and chilling small book by Britain's Chief Medical Officer of Health and her colleagues (Davies et al. 2013). In it she describes an emerging crisis of Antimicrobial Resistance (AMR), caused by the fact that many microorganisms (bacteria, fungi, viruses or parasites) no longer respond to the drugs that have been invented to counteract their effects in creating various diseases. Although reporting is flawed, it is estimated that 25,000 people die in Europe annually because formerly effective drugs no longer work, more than are killed on the roads. Similar levels of deaths from AMR occur in the U.S.A. (CDC 2014; WHO 2014b). The reason is that the bacteria have developed resistance to the drugs that once killed them. It was something that Sir Alexander Fleming warned about after accepting the 1945 Nobel Prize for his discovery of penicillin, namely that it is a fact of nature that bacteria and viruses mutate and develop new forms, and those new disease-causing types will be resistant to drugs that previously killed them. Hence no drug is going to be effective forever. The scale of this problem should not be underestimated, for similar conclusions have been reached by other reports on the problem in the U.S.A. (CDC 2014) and by the WHO (2014b) in its first comprehensive survey of a number of AMR bacteria. If this mortality were linked to some visible, known specific disease, there would be public pressure for a cure. But since the deaths from these resistant microbes are scattered throughout each country, they appear to be almost hidden, so the scale of the problem was underestimated until the last few years. The detailed tables in the WHO report (2014b) also show that there are great variations between countries in the scale of the problem, which varies with different disease-causing microorganisms. Some of the biggest worries are seen in the case of tuberculosis, where in some countries 20 % of previously treated cases proved resistant to multidrug treatment, while in many places gonorrhoea is now being treated with the last drug that still works. In Cyprus 36 % of sampled E. coli infections proved resistant to standard drugs, while in Greece 68 % of sampled infections with the microorganisms that cause pneumonia were drug-resistant (WHO 2014b). In addition, South East Asia is a region where new resistances to anti-malarial drugs have emerged. Certainly the sample sizes vary in each case and cannot be used to conclude that they represent whole populations.
But the evidence is sufficiently worrying for the authors of the three main reports to date to use the words 'crisis' and even 'catastrophe' in their descriptions of the problem. Diseases that were thought to be controlled could return and kill large numbers, while the drugs routinely used to prevent infections after surgery are proving less effective in many areas. In some areas multi-drug treatment has proved effective, but this is very expensive and involves specialized individual treatment that may be the equivalent of $ 80,000 annually, making its use out of the question for poorer people.

The problem of AMR is made worse because the major pharmaceutical companies are no longer researching new antimicrobial drugs to replace those previously effective. From 1934 to 1968 fourteen classes of antibiotics were created, but only five have appeared since that time and none since 1987 (Davies et al. 2013), while currently the WHO (2014b) recognizes 27 different types of effective vaccinations. The reason for the decline in antibiotic development is the cost of innovation, perhaps over $ 1 billion in research, testing and satisfying the regulatory agencies. Companies believe the costs do not justify the potential financial return, given that many of the drugs may be needed only once or on a few occasions to cure some infection, unlike the daily drug regimes for many sufferers of chronic diseases. Also, limited patent time, and the small returns from poor countries where patients cannot afford the drugs although the need is high, add to the reluctance to invest in developing new antimicrobials. So a 'market failure' has occurred: the market does not provide a strong enough incentive to create new antimicrobial drugs. Britain's Chief Medical Officer of Health has identified the problem starkly.

We are losing the battle against infectious diseases. Bacteria are fighting back and are becoming resistant to modern science. In short the drugs don't work… Our response needs to be global and multifaceted to… manage and mitigate the risk of antimicrobial resistance, which is just as important and deadly as climate change and international terrorism. (Davies et al. 2013, pp. ix-xiii)

There have been many suggestions to solve the emerging crisis. One is to resolve the market failure that has reduced research. This can be done by helping companies to finance new drug research, by increasing the participation of government and university researchers, and by encouraging charitable organizations to work with companies in joint venture financing, such as the Medicines for Malaria Venture. In addition, quicker means of identifying new bacterial and viral strains are needed, which is happening through the development of genome sequencing, while new serums may also prove effective. Linked to this is the need for rapid communication through international health agencies about the appearance of new infections, allowing countries to take precautions, which is the role of America's AMR monitoring system set up in 1996 and the WHO's Geneva hub (CDC 2013). What is also needed is low-tech advice, such as better education to reduce the spread of infections through thorough handwashing and the sterilization of surfaces near infected patients. Also there is a need for more radical cleaning methods in hospitals, such as vaporizing techniques that sterilize everything in hospital wards (Zoutman et al. 2011).
There are some signs of success in these efforts, for more careful procedures have reduced rates of a particularly troublesome bacterium, methicillin-resistant Staphylococcus aureus (MRSA), in Britain by 80 % since the 2008 peak, when increasing numbers of people were picking up infections in hospitals (Davies et al. 2013). It has also been argued that there are other reasons for AMR. Too many antibiotics have been given out routinely, often for viral infections that they cannot treat, while many people do not take the full course, meaning bacteria are not properly eradicated and those that resist multiply. In addition, broad spectrum drugs designed to cover many diseases are often prescribed, making the emergence of resistant bacteria more likely. More specific drugs aimed at particular diseases would help, but this requires better diagnostic tests. Also this broad spectrum approach and the indiscriminate use of antibiotics are killing off some of the many millions of 'good' bacteria that exist in our bodies and are helpful for our health. So the WHO and governments at many levels are emphasizing the need to educate doctors and the public to reduce their routine drug use, especially for minor infections that do not need it, which decreases the opportunity for bacterial resistance to emerge. This over-use is especially a problem in countries where antibiotic products can be bought in shops without a prescription. It is not only a developing world issue, but one that exists in many Mediterranean countries, where resistance among five basic bacterial strains has grown, whereas the comparable figure in the U.K., where the drugs need a doctor's prescription, was only 0.4 % (WHO 2014b). The growth of illegally produced and often ineffective antibiotics is another challenge that needs to be rapidly contained. An even greater problem comes from the routine antibiotic doses given to farm animals, especially in the high density farming of chickens, pigs and cattle, rather than waiting for infections to emerge and isolating the infected cases. This greater exposure to the drugs again makes it more likely that resistant bacteria will emerge. Such farming practices are routine in North America but banned in Europe, resulting in Europe closing its markets to such meat products from North America. Drastically reducing this drug use is going to be difficult, given the profits that pharmaceutical companies make from such practices. We may see problems similar to those experienced when governments attempted to reduce tobacco advertising, which led to huge lawsuits from the tobacco firms. Action on AMR is occurring at last, but it is long overdue if what senior medical people and authorities describe as an emerging crisis is to be solved.

Most people in the developed world now spend most of their lives in indoor environments, for time-budget surveys have shown that the average American spends only an hour a day outside. This has led to increased attention being paid to the health problems created by these indoor environments (Godish 2001). In the less developed world the continued use of charcoal or wood burning fires for cooking, with inadequate smoke extraction, leads to respiratory problems and far too many premature deaths, estimated at 4.3 million annually (WHO 2014a).
In the developed world the hazards associated with the historic use of asbestos for insulation and lead in paint have become well known, and their poisonous effects have gradually been reduced by eliminating these materials from new buildings, although older structures may still contain these and other contaminants. The health and crime effects of lead in paint and from car emission fumes have already been described in the discussion on Safe Cities (Chap. 12). Although these and other poisonous materials are being identified and eradicated, a wider set of problems associated with indoor living has come from what has been described as 'the sick building syndrome'. Indeed, it was estimated in 1984 that up to 30 % of new office buildings were affected by the problem (EPA 1991). A wide range of factors contribute to this syndrome, such as: poor ventilation causing polluted air to circulate; poorly sited intakes which bring in exhaust and other fumes; biological contaminants in water ducts and tanks; and various toxic chemical emissions from insulation materials, plywood and artificial fibres, as well as from the over-use of poisonous cleaning chemicals. In addition, dust-laden materials brought in from outside, as well as insects, spores, dirt and skin particles, exacerbate asthmatic conditions, a disease that is itself rapidly increasing. Fortunately, the smoking bans in workplaces, restaurants and public places have helped reduce the problem. It must be acknowledged that it was often cities that began the process of making such practices illegal and subject to fines.

Another problem comes from the fact that living mainly indoors in a climate-controlled environment means that the body does not have to burn so many calories to keep warm. So people with the same eating habits are more likely to become overweight by spending too much time indoors. The increase in indoor living has also reduced exposure to sunlight and its ultraviolet radiation, which creates health-giving vitamin D in the body. Without such exposure, vitamin D deficiencies develop. In the densely populated slums of nineteenth century cities this led to high incidences of rickets, a disease of bone deformation, in children. Traditional societies in northern climes were protected because their diet included high levels of vitamin D from oily fish. Fortunately the disease was largely eradicated in developed countries in the twentieth century by lower density layouts and mandatory green spaces in new housing areas, as well as by dietary supplements containing the essential vitamin, such as the free milk and cod liver oil given to children in Britain during World War II (Gille 2004). Sadly, new cases of rickets have emerged in children occupying crowded houses in parts of London (Michie 2013). A reduction in vitamin D can also lead to a depressive illness known as Seasonal Affective Disorder (SAD). Originally seen in countries with long winters and cloudy skies, such as the deep fjord coastal areas of Norway in particular, it has become increasingly prevalent in those who do not venture outdoors. However, this can be cured by increasing the amount of time spent outside in sunlight, through light therapy from intense light pads (Lurie et al. 2006), and by adding vitamin D to diets.
Another worrying medical trend in recent years has been the explosion in the incidence of autoimmune diseases (AI), such as Type 1 Diabetes, Multiple Sclerosis, Schizophrenia, Rheumatoid Arthritis, Crohn's disease, Coeliac disease (gluten intolerance), several forms of Cancer and, as some researchers suggest, Autism (Yang et al. 2013). These diseases are given the AI label as they are caused by the body attacking its own tissues, with some researchers suggesting they currently affect 5-10 % of the world population, proportions that seem certain to increase, even if they are not yet among the leading causes of mortality and ill-health (Ramagopalan et al. 2009, 2010). The exact causes are still unknown, but it has recently been found that these diseases have higher rates in the developed lands of higher latitudes, as well as in urban compared to rural societies, and with seasonal peaks in late winter and spring (DiSanto et al. 2012). It has also been suggested that these associations seem to account for the higher levels of diseases such as MS in women in cloudy areas such as the Orkneys, in black-skinned individuals who are second-generation immigrants in inner city London, as well as in women in Iran since the Islamic revolution forced many women to wear all-embracing veils (Gille 2004; Michie 2013). Researchers are suggesting that the environmental effect of low sunlight exposure in both northern and, it must be stressed, urban places results in vitamin D deficiencies, which may be at least a partial cause of the explosion in the incidence of many of these non-skeletal problems. The recent increases in asthma rates also seem to have similar causes. Litonjua and Weiss (2007) have calculated that asthma now affects 300 million people worldwide and has shown a 50 % increase since the 1970s, which also seems to be the result of a deficiency of vitamin D, in this case one that impairs lung development in the foetus. More recent research has also discovered that natural levels of vitamin D are low in mothers whose children are born in late winter or early spring. This is due to a lack of exposure to the vitamin D-producing properties of sunlight, again linked to more indoor living. Research reviews have linked over 18 of these autoimmune diseases (AI) to low vitamin D levels in pregnancy, so that deficits in this vitamin go beyond its role in the bone metabolism problems that lead to rickets (Yang et al. 2013). Although the precise causes of the relationships are still under investigation, the low vitamin D effect may be exacerbated by genetic risk and perhaps mothers' diets (Litonjua and Weiss 2011). Researchers have suggested that it is probable that the low vitamin levels may make a pregnant mother's body create antibodies that cause particular genes to malfunction as well as damaging the brain of the foetus. These antibodies are not flushed away before birth, but remain in the body and cause certain genes in some people to malfunction in later years, resulting in the body being attacked by its own immune system. This problem in early gestation means that adding vitamin D later in life does not solve the problem. It has also been argued that this vitamin seems to be essential in providing adequate amounts of serotonin in the brain, levels of which have been shown to be low in autistic children with low empathy levels, which may help explain the higher incidence of the condition among males and those born in late winter (Patrick and Ames 2014).
Although AI research is still in its infancy, recommendations to combat the increase in these diseases are emerging. One is that pregnant women eat more natural sources of vitamin D in fruits and vegetables and take supplements of the vitamin. This research also suggests that the accepted wisdom since the late 1960s of reducing exposure to sunlight because of the risk of skin cancer needs to be modified, for moderate amounts of exposure are needed to reduce subsequent susceptibility to these AI diseases. As Litonjua and Weiss put it: It seems likely that a gradual decrease in exposure to sun due to sun avoidance behaviours in Western societies (sunscreen, clothing, sun avoidance, increased time spent indoors) reached a critical level in the early 1970s, such that humans were not spending enough time outdoors and vitamin D levels reached acutely low levels. Vitamin D is essential to the normal functioning of the human immune system. (Litonjua and Weiss 2007, p. 747) All these suggestions seem especially applicable to the poorest populations, often living in overcrowded parts of cities with few park spaces. Although attempts have been made to ensure poor mothers and their children are provided with vouchers for fruit, vegetables and free vitamins, such as in Britain's Healthy Start programme for children at risk, it has been reported that less than 10 % used this opportunity, a depressing and unnecessary result attributed to gaps in supply chains, delivery systems and poor staff training (Michie 2013). All these emerging problems may be amenable to further breakthroughs in medical research and the greater spread of effective care. But as the failure of some antibiotics and the aging population in particular indicate, there are huge problems developing that will need a great deal of research to combat. Moreover, if infectious diseases cannot be controlled by antibiotics, the historic danger of diseases spreading by contagion in large urban areas will re-emerge. This risk is reinforced by new fast modes of transport by air and train, including cramped subway systems, which crowd people into what are effectively sealed containers and deliver them to far-flung destinations where they quickly and anonymously disperse into the larger settlements and may infect populations not used to the diseases they carry. Also, other emerging health problems, such as AI diseases, seem to have an association with our increased urban living. These negative effects of urban living mean there should be a major emphasis on providing healthier urban behaviours and environments. The so-called 'health field' concept promoted by a former Canadian Minister of National Health and Welfare (Lalonde 1974) is often regarded as an important marker of a renewed emphasis upon public health after its dominance faded from the 1880s once urban environmental conditions improved. The health field approach moved away from an emphasis upon medical care alone by stressing the need to investigate all four key categories of the factors that affect health: human biology (all factors related to the body, including genetics); environment (all features outside the body that affect health, but over which individuals have little control); life-style (personal decisions made by individuals that affect health, such as drug-taking etc.); and the provision of health care facilities (their quality, availability and accessibility).
Other studies have provided alternative versions of the main categories of what are now known as health determinants, sometimes in diagrammatic form (Dahlgren and Whitehead 2006). These summary features have been extended by more recent research that has shown the need to look more closely at a subset of these determinants that are often overlooked, namely how various characteristics of the social realm, such as the effects of socio-economic inequalities, stress or social exclusion, influence health outcomes. These factors are part of what are being described as the social determinants of health (Wilkinson and Marmot 1999, 2003), although the term has sometimes been used to categorize most determinants under the 'social' title (LCC 2012, p. 6). These summaries of the various health determinants provide useful initial descriptions of many of the relevant groups of factors that influence ill-health. But it does seem worthwhile to extend and re-arrange the ideas to provide a more comprehensive view. In Fig. 13.1 eight broad domains of factors that influence health are identified, together with their surrounding contexts. Within these health determinant categories, examples of the more specific factors that have deleterious effects on health can be identified. Many of these are not routinely dealt with by current health care systems, but are now being more closely investigated by epidemiologists and other researchers. These domains may initially be considered as separate categories of related factors for the sake of clarification; in reality many often combine with one another to create ill-health, especially in our complex, multi-faceted urban places. Also, many of the factors have two-way relationships with one another, including modifications of human biology through epigenetic processes linked to life-styles or environments that affect the operation of genes. Moreover, some of the features are directly linked to disease; others take more time to cause ill-health. But the influence of all these factors cannot be discounted when evaluating health in an area, since they provide the background conditions that leave the body weakened and prone to specific diseases. Although these factors can be looked at individually, in practice they combine in a spatial context to create differential place characteristics, not simply in physical or built-environment terms but also through economic and social environmental characteristics and life-style choices. This ensures that each location will have very different risks and protections for health outcomes. So a continuing challenge for a research field such as epidemiology is to get to the root causes of ill-health and premature mortality in various locations and to provide new preventative health policies that relate to all the health determinants. Unfortunately, as the authors of a text on Unhealthy Cities have argued, the twentieth century approach to public health has until recently ignored or at least underestimated many of the spatial variations in these determinants: …In part because public health has been burdened with a tradition that overemphasizes individual-risk factors, the consequences of social and environmental conditions for health promotion and illness have been overlooked. (Fitzpatrick and La Gory 2011, p. 155)
So even though the twentieth century focus on the biomedical model has ensured major advances in the level of health and continues to do so, hopefully by solving the problem of AMR and the new diseases, there is also a pressing need to reduce the negative place-based factors that contribute to ill-health by investigating the impact of the other health determinants. These are the first three health determinants shown in Fig. 13.1. The most fundamental influence on the health of persons or populations stems from their biological constitution, their ability to ward off and recover from disease and disabilities. In this context features such as age, gender, life-history and the genetic make-up of people are all important. Age is really a matter of being more prone to certain problems as the body ages and experiences what amounts to wear and tear, or in the early years when a baby may not have developed the strength or immunities to fight off the dangers in the local environment. Gender is also important, as males or females may be more liable to suffer from one type of disease or disability than the other. The latter certainly live longer on average than the former, although some of the difference is due to greater risk-taking in males and their more unhealthy life-style choices. Some families may have flaws in their genetic make-up that make them more prone to certain diseases. In addition, some groups have a greater ability than others to resist certain diseases, not because of inherent racial features, but through centuries of developed immunity to problems caused by some pathogens. Others do not have these resistances. The tragic result of these differences not being understood was seen in the European Age of Exploration, when the indigenous peoples of the Americas died in their tens of millions from diseases introduced by Europeans. Table 13.1 separates medical knowledge from health care, since high levels of knowledge do not mean there is 'cure and care' for all individuals in an area, especially while there are many different forms of care and differential access to it. Continued research and the effective and equitable application of all of these determinants are obviously needed, especially given the need to counteract the severe challenges identified above that come from emerging medical problems. Nevertheless, there are still many spatial inequalities in the provision, availability and effectiveness of medical care, not simply within and between cities, but in urban centres in the developing world that have limited facilities and where only the affluent get adequate care. In addition there are increasing worries about accelerating health care costs, with ever more technical procedures and longer treatments for the new chronic diseases. Some indication of the escalation of costs can be seen in the fact that the budget of the National Health Service in the U.K., which provides universal coverage for all citizens, was close to $15 billion (£9 billion) in today's inflation-adjusted values when the programme began in 1948. But the budget is now more than 12 times this original amount and is growing at 4 % a year, with increasing needs from an aging population.
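To make the scale of this escalation concrete, the compounding effect of a 4 % annual growth rate can be sketched in a few lines of Python. This is an illustrative calculation only: the starting value is simply twelve times the inflation-adjusted 1948 figure quoted above, not an actual NHS budget line.

```python
# Illustrative sketch only: compound growth of a health budget at 4 % a year.
start_budget = 12 * 15.0   # ~$180 billion, derived from the figures in the text
growth_rate = 0.04         # 4 % annual growth, as quoted above

budget = start_budget
for year in range(18):     # project 18 years ahead
    budget *= 1 + growth_rate

print(f"budget after 18 years: ~${budget:.0f} billion")   # roughly double
print(f"approximate doubling time: {72 / (growth_rate * 100):.0f} years")  # 'rule of 72'
```

In other words, even without new cost pressures, a budget growing at 4 % a year roughly doubles every 18 years, which is why administrative savings alone are unlikely to close the gap.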
Although more careful administration and cost savings in hospitals or in care facilities may reduce the figure, the size of the budget and its growth make it unlikely that they will do more than modify the total, meaning that new ways of funding the system, perhaps through larger tax increases, will be required. These costs are more difficult to absorb in the less developed world, where access to care and its availability are limited. Indeed, globally, health care availability is inversely related to need (Howatt et al. 2012). This is leading to the development of so-called frugal technologies to solve medical problems at lower cost, using less sophisticated techniques, in order to increase availability in low income areas. Another problem that has emerged is corruption and fraud in some countries. For example, the U.S. system is largely a private one, although there are government agencies that provide care and subsidies for the aged (Medicare) and the poor (Medicaid). Recent reports suggest fraud could be as high as $272 billion annually, which is around 10 % of U.S. medical spending (TE 2014). This health care fraud comes from many sources, which include: suppliers charging Medicaid for non-existent supplies; patients claiming benefits to which they are not entitled; doctors or health clinics charging for extra or nonessential services, or writing prescriptions for painkillers that are then sold on; and people stealing medical cards and using them to bill Medicare for services to which they are not entitled. Steps are being taken to reduce this enormous leakage of funds, including simplifying the claims systems and providing more vigorous checking. This problem of wasted money may be more obvious in the complicated U.S. system but certainly exists in other jurisdictions and must be reduced so that health care money can be spent more wisely and fairly. The fourth determinant contributing to the health or ill-health of populations comes from the life-style choices that people make, although these effects are often cumulative and take place over decades. The strong adherence to the principle that populations in democratic countries should be able to make their own choices reduces the ability to create quick solutions, especially of a coercive nature, so multi-pronged approaches are usually needed. It has already been noted that the population in the developed world in particular spends too much time indoors. There is a need to redesign areas and to provide leisure spaces where more outdoor activity is encouraged. However, the most obviously pernicious effects of life-style choices come from the increase in the use of addictive substances that drastically reduce the life-span of people using such products. For example, despite the now well-known health hazards of smoking, an estimated 1.2 billion people still use tobacco regularly and the proportion of smokers even in OECD countries is often still over 20 %. This also puts individuals adjacent to smokers at high risk from second-hand smoke inhalation, which leads to higher rates of respiratory problems and cancers. Similarly, drug use leads to premature deaths as well as to crime problems. But a more general problem comes from the overuse of alcohol, which leads to liver and heart diseases, as well as contributing to traffic accidents. A recent study revealed that 25 % of Russian males die before the age of 55, compared with 7 % in the U.
K., with an average age of death for males of 64, largely a result of the overconsumption of vodka (Zaridze et al. 2014). However, there is also evidence that beer and wine in small quantities have a positive effect on health (Preedy 2011) and helped reduce mortality rates in historic cities, where beer, its alcohol killing bacteria, was safer than contaminated water. All these problems require determined efforts to reduce addiction levels. In the case of tobacco use many initiatives have helped reduce the problem, such as: increasing taxes and banning advertising of the products, adding health warnings, increasing educational programmes to avoid or reduce use, and banning their use in public spaces, though the constant litigation by tobacco companies has slowed progress. Still more effective policies are needed, which will probably require a combination of persuasion and regulation. Outright prohibition of products drives the issue underground and criminalizes the behaviour, as shown by the U.S. prohibition of alcohol in 1920-1933. Many of the anti-tobacco policies could be applied to reduce binge drinking. The same applies to drug use, where despite billions spent on attacking the sources of supply and distribution the problem is as bad as ever. Some have argued that removing criminal prosecutions and treating drug use as a medical, not criminal, problem would help reduce the addictions. But the majority opinion is still against such a change, which means that alternative methods must be found. A more recent health problem associated with life-style relates to dietary choices. Changes in the diet of people in many western countries, with overconsumption of sugary drinks and foods rich in fat, red meat, sugar and salt, have led to increased obesity, leading to a greater probability of ill-health from a number of diseases. For example, it is currently estimated that one-third of school-age children in the U.S.A. and two-thirds of American adults are overweight or obese, with far higher rates among Afro-American and Hispanic women (Ogden et al. 2014). Being overweight or obese increases the risk from a variety of chronic diseases, especially heart failure, stroke and diabetes, which are among the top causes of mortality, with numbers projected to increase substantially in the next 15 years (Table 13.1). This trend is not simply a result of increasing purchasing power, although this makes a huge contribution, but also comes from the relentless advertising by manufacturers to increase demand for their products, especially to children, creating what amounts to an obesogenic environment. In addition, far larger portions of most meals and drinks are available compared to 30 years ago, which compounds the problem. Another contributing factor is the limited availability of healthy food options in low-income areas. National legislation is now being enacted in some countries to address this growing problem. For example, almost a third of the population in Mexico was reported as being overweight, 14 % had diabetes and the average daily consumption of sugary drinks was half a litre. This led to Mexico's 2013 decision to add a 10 % tax on sweetened beverages, supported by major advertising campaigns. In other countries, such as Britain, attempts are being made to create Agreement Partnerships with large retailers to reduce the number of price promotions on products with high levels of fat and sugar, as well as on alcohol products.
Many also believe it is important to reduce the level of advertising of such products, especially advertising aimed at children. The mandatory labelling of calorie levels in tinned or packaged products is also taking place, in the hope that the public will become more aware and make more informed choices about what, and how much, they are eating. Yet only a multifaceted strategy will solve this increasing crisis, as summarized in a WHO report (2000) on Obesity: Preventing and Managing the Global Epidemic. Unfortunately, limited progress has been achieved since its publication. However, some cities are taking the lead in public health actions such as those relating to diet, before national regulations are developed. For example, Alcon (2012) has argued that Mayor Bloomberg's leadership in promoting healthy initiatives in New York provided one of the first examples of a municipal politician taking controversial stances in this field. This led the Health Commissioner, Thomas Farley, to note that: …For way too long public health departments have defined their responsibility as essentially infectious disease control, rather than the improvement of health of the population. (Alcon 2012, p. 2038) The New York approach has been to implement policies to reduce levels of addiction and improve diets, such as by banning smoking in public places and businesses, plastering anti-smoking advertisements on subways, banning trans fats in restaurants, and an attempted ban on large sugary drink containers that was struck down by the courts. In addition there has been greater targeting of crime problem areas, for most have high levels of ill-health, as seen in the crime measures described in the discussion on Safe Cities (Chap. 12), and the creation of district public health offices in areas of low health, such as one in the Bronx. In other countries more and more planning decisions are being made to improve the walkability of subdivisions and to add segregated bicycle paths to increase this form of transport, as described in Chaps. 4 and 6. Such policies are designed to increase exercise levels, which in turn improves health. But it is overeating the wrong types of food that is the major problem in most areas. In addition, the trend that has reduced play time, sports and exercise periods in schools is being reversed in many cities in an attempt to increase the fitness and therefore the health of children, while the increasing popularity of physical health clubs for adults shows that a start has been made on the problem by at least some people. Although it is difficult to quantify the effect of exercise, recent surveys indicated that 9 % of premature mortality from noncommunicable diseases is due to low levels of activity, although this varied with the type of disease or illness (Lee et al. 2012), while other researchers argued that participation in sport is associated with a 20-40 % reduction in all-cause mortality (Khan et al. 2012). Behind these first four domains of health factors are four categories of environmental determinants, related to the natural, built, economic and social environments. Although the causes of disability and premature mortality often relate to one or more of these factors, a WHO survey of the environmental effect on ill-health (Prüss-Üstün and Corvalán 2006) estimated that 24 % of the global disease burden (healthy years lost) and 23 % of annual deaths are due to what are described as 'modifiable environmental factors', proportions that rise to nearer a third for children.
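Estimates like the 9 % of premature mortality attributed to inactivity, or the 24 % of disease burden attributed to modifiable environmental factors, are generally built up from population attributable fractions. A minimal sketch of that calculation is given below; the prevalence and relative-risk values are illustrative assumptions only, not figures taken from Lee et al. (2012) or the WHO survey.

```python
# Population attributable fraction (Levin's formula):
#   PAF = p * (RR - 1) / (1 + p * (RR - 1))
# where p is the prevalence of the exposure and RR the relative risk it carries.
# The numbers below are illustrative assumptions, not published estimates.

def attributable_fraction(prevalence: float, relative_risk: float) -> float:
    excess = prevalence * (relative_risk - 1.0)
    return excess / (1.0 + excess)

# e.g. if ~30 % of adults were physically inactive and inactivity raised the
# risk of dying from a given disease by a factor of ~1.4:
paf = attributable_fraction(prevalence=0.30, relative_risk=1.4)
print(f"share of deaths attributable to the exposure: {paf:.1%}")  # ~10.7 %
```

Summing such fractions across diseases, weighted by their death counts, is broadly how aggregate figures of this kind are assembled, which is also why they shift so much with the assumed prevalences and relative risks.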
These are described as the factors in the physical and built environments that can be altered; they exclude factors such as disease vectors in water bodies, impregnated bed nets, as well as social behaviours or life-styles. The WHO report showed that the burden of environmental effects on premature mortality and ill-health is far higher in the developing countries and varies with disease type. Studies of the risk factors of different diseases show that 85 out of the 102 types covered had an environmental component, although these varied considerably. The highest mortality proportions were seen with diarrhoea (94 %) and malaria (42 %), far higher than the 20 % from lower respiratory causes, although this rises to over 40 % in developing areas, primarily due to fumes from the use of solid fuel fires in enclosed spaces. The risk may be greater than previously thought, as recent work is showing that environmental effects can be passed between generations through epigenetic processes whereby genes are turned on or off (Tollefsbol 2014). The first of these environmental influences comes from physical conditions, features often forgotten when the focus of attention is upon cities in the developed mid-latitude world that have less extreme environments. For example, health is often affected by extreme deviations from normal climatic regimes, such as in cases of extreme cold or heat and high humidity, when the elderly, the young and those in ill-health die in larger numbers. Some of these problems are made worse by the heat-island effect of all the concrete, steel and bricks in cities. For example, in July 1995 a Chicago heat wave led to temperatures reaching over 40 °C (the effective temperatures felt by the body in the sun and inside buildings were far higher), resulting in the deaths of over 600 people (Klinenberg 2002). The size of the mortality from this event in one city is worth comparing with other statistics. In the decade after 1992, 2190 Americans died from heat-related causes compared to 880 from floods and 150 from hurricanes. Although this shows that extreme heat can result in spikes of mortality, it is important not to underestimate the effect of disasters from natural hazards, when many thousands, even millions, of people have been killed by major storms, earthquakes and volcanic eruptions, as described in Chap. 8. The second health effect of the physical environment comes from various biotic hazards, ranging from snakes to toxic plants, although these are often less of a problem in cities as they are often deliberately eradicated. Of greater significance are the diseases spread by contagion in a densely populated area by pathogens developed in, or carried by, various animals or insects. Perhaps the best known historic examples come from the various plagues, such as the Black Death, which spread from Asia via trade routes, peaked in Europe in 1348-1350 and led to the deaths of an estimated 75-200 million people across Eurasia and North Africa, including between 30 and 60 % of Europe's population, although it must be noted that the century also saw the end of the medieval warm period and poor harvests, which led to food shortages and severely weakened the population (Herlihy 1997). More modern examples range from the previously described rapid global spread of HIV-AIDS, which contrasts with the containment of SARS. The mortality rates and ill-health effects of many tropical diseases are still far too high.
Many, though not all, now have cures, but these are often unavailable in many cities and regions. One of the biggest health problems is still associated with malaria, spread by parasites carried by certain types of female mosquitoes in tropical and subtropical areas. Although not in the top 20 causes of death in Table 13.1, it has been argued that the 0.66 million deaths estimated by the WHO represent a gross underestimate, with the true figure being closer to 1.24 million, let alone the debilitating effects on many more people (Murray et al. 2012). Certainly medicines such as quinine and several more recent drugs, and the use of nets over beds at dusk and at night when mosquitoes are flying, have reduced the risk of the disease. So has the draining of marshes and standing pools of water in urban areas, which reduces the number of places where the insects breed. Snowden's (2006) detailed study of the effects of malaria in Italy until it was finally eradicated in 1962 showed how the disease had previously led to low productivity, premature death and economic backwardness in the country. It needed a multi-pronged state strategy from the early twentieth century to remove the scourge, one that involved medical breakthroughs, swamp draining, rural clinics and education in preventative measures. In tropical areas the hoped-for final solution of chemical spraying of marshes with DDT in the 1960s proved illusory, as the mosquitoes developed resistance to the insecticides and the parasites to particular drugs, while the sprays poisoned the water, creating other problems for a whole range of birds and animals. So despite determined efforts to reduce its effects, through new drugs, swamp draining and more recently through experiments with the mass inoculation of populations in some areas to reduce the number of people acting as hosts to the parasite, malaria still remains a significant cause of ill-health and premature death. This may get worse as some of the malaria-causing parasites are becoming resistant to the drugs used to treat them, which makes new treatment discoveries a priority. These are only a few examples of the death and disability caused by factors from the physical environment, especially in the hotter, wetter environments where a large number of other serious diseases lurk, such as dengue, yellow fever and typhoid. Some people in these areas have increased tolerance to these infections, but many of these diseases have an increased ability to harm people who have travelled from other environments without the appropriate medical treatment. One of the big emerging problems is the effect of global warming. This will probably lead to lower aggregate yields from current crop varieties, leading to food shortages given world population growth, unless new varieties can be developed. In addition, the warming will allow many flies and insects, and hence the pathogens they carry, to spread further north, infecting people with little immunity. So malaria may spread again to Europe, just as West Nile disease is already becoming common in North America. The fact that diseases such as dengue and the African-origin chikungunya fever have recently spread to the Caribbean, probably in the blood of an infected airline passenger and then into the local mosquito population, means that these and other infectious diseases are likely to occur in more northerly locations, causing additional health problems.
The second set of environmental factors that affect health comes from poor man-made or built environments, issues identified as causing the high mortality that led to the first healthy city movement. Progress in improving built environments, including infrastructures such as transport systems, to reduce their negative outputs has not halted, at least in the developed world. For example, many public sector high-rises have been razed because of high crime rates and poor construction (Coleman 1985). In terms of harmful outputs from cities, Chap. 6 has described the major improvements in the technical ability to process and use effluent more safely in sewage processing plants, while water can be re-cycled for human use. In addition, instead of new expensive solutions to bring water from long distances, a series of new devices for reducing run-off and using this water means that new water conservation measures are being adopted in some jurisdictions, which will help alleviate water shortages in drier climates. However, these nineteenth century advances, let alone those of recent years, are still limited in the cities of the less developed world, for the WHO (2012b) estimates that 1.1 billion people still lack access to clean water and 2.6 billion do not have even a basic latrine, although there have been major improvements in these basic sanitation and water supply facilities in the last decade. Many of these people live in rural areas. But very unhealthy conditions linked to poor water supply and limited sanitation also exist in many urban places in developing countries, especially in the informal shanty towns that are home to many in their towns and cities. One result of poor sanitation is the annual death toll of 1.8 million from diarrhoea, currently fifth in the WHO table of annual deaths. Although it is estimated to fall to ninth position, it may still claim 1.6 million deaths a year in 2030, even though it is relatively easily treatable by modern medical techniques. Health problems in cities are not simply a result of poorly constructed buildings and infrastructures; there are often health problems associated with the use of manufactured substances whose poisonous side-effects are often underestimated. Some of these were identified in Chap. 12 (Safe Cities), such as the lead in gasoline that produced more toxic car exhaust emissions, which have been shown to affect brain development and subsequently I.Q. levels, and to increase crime rates (Reyes 2007; Markus et al. 2010). More apparent are the deaths and disabilities from road accidents, which are among the most important causes of premature death (Davies 1997). The WHO's Global Study on Road Safety (WHO 2013a) showed that 1.4 million people die on the roads annually, projected to increase to 1.8 million by 2030, with somewhere between 20 and 50 million more suffering serious injuries today (Table 13.1). Certainly there are signs of improvement in the mortality rates in countries such as the U.S.A., with deaths down to 34,000 a year in 2012 from levels of 54,000 in the early 1970s, which is still a rate of around 1.08 deaths per 10,000 people a year, or the equivalent of the population of a small country town each year. Even this figure is unacceptable. In any case there is little room for complacency in world terms. After all, 91 % of the fatalities are now in low and middle income countries, places that so far contain only half of the world's vehicles.
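As a quick check on that rate, the arithmetic can be sketched as below; the U.S. population figure of roughly 314 million for 2012 is an assumption used only for illustration.

```python
# Rough check of the U.S. road mortality rate quoted above.
deaths_2012 = 34_000          # annual road deaths cited in the text
population = 314_000_000      # approximate U.S. population in 2012 (assumption)

rate_per_10k = deaths_2012 / population * 10_000
rate_per_100k = deaths_2012 / population * 100_000
print(f"{rate_per_10k:.2f} deaths per 10,000 people")    # ~1.08
print(f"{rate_per_100k:.1f} deaths per 100,000 people")  # ~10.8
```

Expressed per 100,000 people, the resulting figure of roughly 11 is the form in which road mortality rates are usually compared internationally.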
This high proportion is partly because only 28 countries (with 7 % of the world's population) have compulsory road safety laws covering five critical areas: drink-driving, speeding, seat-belt use, child restraints, and the enforced use of helmets by motorcyclists and cyclists. In many countries mortality rates could be reduced if the following problems were solved: the lack of driving tests; the absence of safe crossing places and pavements for pedestrians; the refusal to obey traffic signals or to keep to the designated side of the road; erratic driving behaviour; and the limited availability of effective ambulance services that can quickly administer care and take traffic victims to hospital. Although mortality and injury rates have dropped in most of the developed world, there is still room for progress in their cities, such as better designed roads and junctions, especially at high-accident locations, effective laws to reduce distracted driving, segregated bicycle lanes, more public transit and, as some cities are showing, saving lives by reducing speed limits in residential areas to 30 km/h or below. Improved transport systems and the use of bicycles also provide healthier options, with the latter in particular providing exercise. Man-made urban environments also affect health through the higher noise levels and the stress that many people feel in these areas, often combined with an absence of meaningful supportive relationships, issues usually dealt with under the social determinants of health category. More specifically, health is also affected by all the pollutants generated by human numbers and their activities, from faecal matter to the noxious products of industrial processes and household goods, poisonous chemicals and nuclear outputs in some places, and the fumes and particulates from burning fossil fuels (EPA 1999). These not only cause local health problems but, by altering the atmosphere, are threatening to alter the climate of the planet. Many of these have already been described in the sustainable development discussion (Chap. 5). A WHO report (2014a) on pollution levels in 1600 cities of the world maintains that air pollution, both indoor from stoves and fires and outdoor from vehicle and industrial fossil fuel emissions, leads to over 7 million deaths annually. Although the Indian government disputes the figures, the source shows that Delhi now has worse air quality than Beijing, as measured in small airborne particles less than 2.5 micrometres in size (PM2.5), which are associated with increased rates of bronchitis, lung cancer and heart disease. However, not all aspects of a created environment are damaging. The addition of parks, green spaces, recreation trails, nature areas and the like has long been regarded as having a positive effect on humans, as the discussion on the relationship between green space and health in the chapter on Green Cities has shown (Chap. 4). Such areas provide an antidote, in at least some cities, to the increase in urban densities and indoor living. Many investigators, from the nineteenth century social commentators to the present (Adler and Ostrove 1999), have shown that people living in disadvantaged socio-economic circumstances suffer more ill-health and earlier death than others.
However, the adoption of the 'care and cure' medical approach has tended to blame such health problems on individuals' life-style choices, rather than on the wider structural circumstances of their lives in the socio-economic environment that negatively affect their health. These components of disadvantage are now usually labelled the social determinants of health, with Wilkinson and Marmot (1999, 2003) using empirical research results to identify ten significant factors in causing ill-health, summarized as: the social gradient, unemployment, work conditions, food, stress, early life, social exclusion, social support, addictions and transport. But it does seem useful to separate the factors into what amount to economic and social factors, rather than having one long list, while the addictions and transport factors have already been dealt with in the previous discussions on life-style and the man-made environment respectively. One of the most powerful explanations of ill-health among the economic determinants is described by Wilkinson and Marmot (1999) as the 'social gradient'. This means that people in the lowest income brackets have shorter lives than people in higher income brackets, as seen in the Canadian example in Fig. 13.2, a pattern that also applies to levels of ill-health. The pattern is much more pronounced in less developed countries. Less money for food, heating, adequate clothing, the ability to pay for and access medical care, and knowledge of how to look after oneself are all contributory factors, but the lack of education, and hence of the ability to improve one's position in the social hierarchy, is also a major contributor to this general trend. Populations that have higher levels of educational and occupational qualifications tend to have lower levels of dementia, indicating that there is some kind of cognitive reserve that counteracts the effect of this neuro-degeneration (Prince et al. 2012). However, it was previously noted that the modification of other life-style factors, such as physical inactivity, could drastically reduce the incidence of the disease (Norton et al. 2014). Another powerful social determinant of ill-health is unemployment, as are limited work satisfaction and difficult working conditions. In general, people at the lower end of the social gradient also have longer periods of poorer health in their later years due to various causes. Rates of various mental diseases are also higher in cities. The level of nutrition is also critical in improving health. The increasing quantities of food additives, such as excessive salt, sugar and preservatives, found in many products in supermarkets and fast food outlets have led to higher incidences of heart disease and obesity in particular. Certainly some of these issues are a result of poor life-style choices by disadvantaged people, but they often lack local sources of healthy food. A good diet, with a variety of vegetables and fruit and adequate amounts of each, is essential to maintaining health, for malnutrition leads to ill-health and also to vitamin deficiencies which can lead to diseases. There is still controversy over these issues, but a meta-analysis of research findings confirmed a linear relationship between stroke risk and low fruit and vegetable consumption, with the latest research suggesting that well over six portions of fruit and vegetables a day are needed to obtain the necessary vitamins (Qu et al. 2014).
Nutritious food may not be available in disadvantaged areas, as shown by the concept of food deserts, that is, poor inner-city areas without adequate food outlets supplying healthy and fresh foods, a characteristic described in Chap. 4. Health education may improve the choices made, but it is rarely as effective as expected unless it starts at an early age in school, and even then family food choices may reduce possible progress. The presence of large social gradients or economic inequality in many countries and within urban areas also means that many people lack the economic power to improve their shelter or clothing, which then affects their state of health. For most of the twentieth century better wages, welfare measures and social provision led to a decline in inequality levels in developed countries. But as Chap. 3 has shown, the past 20-30 years have seen levels of inequality in income and wealth rising in developed countries, while the explosive growth of urban places in the less developed world has led to larger areas of poor squatter settlements. Progress towards healthier cities requires more attention to the provision of Basic Needs in food, shelter, education, etc., to improving the Capabilities of disadvantaged populations to better their own condition and, as addressed by Just City initiatives (Chap. 3), to giving more people a voice in decisions that affect their areas. Some of these needs can be addressed by municipal actions, for example: in shelter, by subsidized housing or community-based non-profit housing agencies; in nutrition, through organizations providing food banks for those in need; or through planning decisions that take the health impact on a displaced population into account. However, the most fundamental changes are probably still made by national or state policy through taxation, welfare, social provisions and ecological regulations to remove the structural disadvantages of those at the low end of the social gradient, disadvantages created by societal, not personal, factors. A series of interacting factors that may be described as the lived experiences of people are usually called the social determinants of health. A key factor is the presence of strong supportive networks of family or friends, which are correlated with better health. The opposite applies to those who are subject to social isolation, which may be by choice but is more likely the result of an absence of friends or family, perhaps due to age, poverty, disability or poor self-esteem. Poorer health is also found in those suffering social exclusion, especially discrimination on ethnic or religious grounds. Indeed, loneliness and social isolation, which are different from depression, have been shown to increase the propensity to functional decline and ill-health, as well as increasing the chances of earlier mortality (Cacioppo and Patrick 2009; Perissinotto et al. 2012). In addition, there is the problem of stress, which arises in the workplace as well as in homes and neighbourhoods, and is now accepted as having a biological effect. As McEwen summarizes it: Hormones associated with the chronic stress burden protect the body in the short run and promote adaptation (allostasis), but in the long run, the burden of chronic stress causes changes in the brain and body that can lead to disease (allostatic load and overload). Brain circuits are plastic and remodelled by stress to change the balance between anxiety, mood control, memory, and decision making.
Such changes may have adaptive value in particular contexts, but their persistence and lack of reversibility can be maladaptive. (McEwen 2012, p. 1) These social determinants create a poorer quality of life, with lower self-esteem and well-being. But others exist. For example, it is also being recognised that a healthier life is more likely when there are high levels of positive nurturing, such as when children are given a more supportive start to life through good education, nutrition and a caring environment with limited exposure to stress or violent behaviour (Brooks-Gunn et al. 1993). This means helping mothers as well as their children to develop in such environments. In later life, worries about safety, let alone actual physical violence, are also important contributory factors in ill-health. In addition, some beliefs and values, such as the refusal to accept certain types of medical care because of religious beliefs or prejudices, are further negative factors. The extent to which there are supportive facilities in an area, from quality schools to churches, community associations or the provision of home help, can also create more caring social environments, building the social capital that provides support, although these may be considered part of an area's infrastructure. Added to the problems arising from these essentially empirical characteristics are those that come from the affective dimensions, which deal with attitudes and feelings. Areas of high crime with social and health disadvantages display at least ten types of negative attitudinal characteristics that can reduce well-being. These were labelled as areas or 'terrains' of: social inadequacy; despair and limited goals; exclusion and discrimination; acceptance of decay and destruction; anxiety and fear; spontaneity of actions and emotions; indifference to others; low restraint and self-control; anti-social attitudes; and peer group (gang) allegiance (Davies 2004). All these attitudes reduce the well-being and health of people in the areas where they are found, and need to be countered if progress is to be made in improving the lives of the people affected. Many social commentators, from those of the nineteenth century to the present, have suggested that being poor and stressed seems literally to make one sick. In addition, the effects of poverty, stress, exposure to violence, poor nutrition and limited affection are also known to lead to far lower acquisition of early language skills in children, and hence to slower subsequent intellectual progress. Also, Fonagy (2003) has argued that the usual socialization of young children 'out of aggression' that occurs in nurturing parenting is limited in disadvantaged areas, producing rapid and irrational responses to perceived threats. This type of primitive, unconscious reaction to stimuli based on previous experiences, bypassing any rational evaluation, has been identified by the cognitive psychologist Aaron Beck (1999) as the main basis of an inadequate control of emotions, which leads people to violence in later life. Increasingly, it is recognized that these health and behavioural associations are not simply a matter of life-style choices, although these may contribute to the situation. What amounts to a biological embedding of the effects of the early environment occurs through epigenetic processes that affect the influence of genes, while it was earlier observed (Sect.
13.2.2.8) that recent research is showing strong relationships between vitamin D deficiency in pregnant mothers and increasing rates of autoimmune diseases in their offspring. Stimulated by the work of Professor Elizabeth Blackburn, who shared the 2009 Nobel Prize for the discovery of how chromosomes are protected by telomeres and the enzyme telomerase, and whose later studies linked chronic stress to biological damage, the mechanisms of these connections between the social environment and embedded biological effects are now being explored in more detail, as seen in a special issue of the Proceedings of the National Academy of Sciences (Boyce et al. 2012) and in a book edited by Tollefsbol (2014) that reviews transgenerational epigenetic effects. One of the ways in which this environment-embodiment relationship occurs is through the shortening of telomeres, which act rather like a cap on the end of chromosomes that protects the DNA during replication (Epel et al. 2004). This shortening is associated with higher levels of ill-health and a predisposition to many diseases in later life. A recent pioneering study of 9-year-old Afro-American children who had suffered chronic stress from an upbringing in poverty and unstable families, and who were subject to harsh parenting and maternal depression, revealed a shortening of their telomeres compared to children from more affluent and nurturing backgrounds (Mitchell et al. 2014). So these disadvantaged children are likely to experience what amounts to premature aging and morbidity through degraded physical functions that make them more prone to disease. But the effect was not uniform. This association between the social environment and telomere length (TL) was also moderated by the extent of genetic variation in the subjects within the neurotransmitter pathways of the serotonin and dopamine genes that control the extent of pleasure and excitement responses. So some of the subjects had a genetically influenced higher likelihood of sensitivity. It was shown that those with the highest sensitivity levels were more affected by their environment, having the shortest TL when exposed to these stressful environments. But they had the longest TL when exposed to advantaged environments. The researchers suggest that this points to the presence of a genetic predisposition in responses to the social environment. Children with high sensitivity levels are even worse off under poor nurturing and environments, but gain greater benefits from better conditions. Yet those with limited genetic sensitivity may not be so affected by a negative social environment, which may explain why some people survive highly disadvantaged areas and their resilience enables them to subsequently flourish. Yet the general effect is that the children have their biological structure damaged by the stressful social environment of their upbringing, leading to future health problems. These findings mean that children in these areas are not simply disadvantaged because of locally poor educational opportunities to improve their lives. Their negative environment effectively ages them and makes them more liable to ill-health. This recent research provides justification for early childhood interventions providing effective nurturing and creating less stressful environments, to reduce the probability of what amounts to physical damage through epigenetic processes. The research has moved the known correlations between childhood disadvantage and future ill-health towards causal relationships, although not all will be affected to the same extent.
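The 'moderation' described here is, in statistical terms, a gene-by-environment interaction. The sketch below shows how such a model is commonly specified; the variable names and the simulated data are hypothetical illustrations, not the measures or results of Mitchell et al. (2014).

```python
# Illustrative gene-environment interaction model (hypothetical data, not the
# Mitchell et al. 2014 dataset). Telomere length is modelled as depending on
# social disadvantage, genetic sensitivity, and their interaction.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 500
disadvantage = rng.integers(0, 2, n)   # 0 = advantaged, 1 = disadvantaged upbringing
sensitivity = rng.integers(0, 2, n)    # 0 = low, 1 = high genetic sensitivity
# assumed pattern: high-sensitivity children do best in good environments and
# worst in poor ones ('differential susceptibility'), plus random noise
telomere = (1.0
            - 0.10 * disadvantage
            + 0.05 * sensitivity
            - 0.15 * disadvantage * sensitivity
            + rng.normal(0, 0.05, n))

df = pd.DataFrame({"tl": telomere, "dis": disadvantage, "sens": sensitivity})
model = smf.ols("tl ~ dis * sens", data=df).fit()  # 'dis:sens' is the interaction term
print(model.params)  # a clearly negative dis:sens coefficient signals moderation
```

A significantly negative interaction coefficient would correspond to the pattern described above, with high-sensitivity children showing both the shortest telomeres under adversity and the longest under advantage.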
A final domain of influential factors affecting health is more wide-ranging, in the sense that it shows the cultural context-health connection, illustrating how the culture of societies, which includes political power and relationships, affects all of the other determinants. Figure 13.1 shows that cultural factors can impede or improve each of the various health determinant categories. For example, there have been many cases where a religious or sub-cultural group in countries or cities has refused to believe in particular medical treatments, while new medical advances have also been resisted by some practitioners. The adoption of malaria nets over beds in East Africa is a good example of the former. The uptake of nets supplied by international agencies in the 1990s was extremely slow at first, despite overwhelming evidence showing their effectiveness. The white nets, however, were deemed too similar to the shrouds used to cover the dead, so they were not used. When the colour was changed from white to green they were quickly adopted by the local population. There is also a cultural-political background to the level of medical facilities. Governments may provide more or less money for medical care. In terms of life-style choices, regulations on access to addictive substances make it easier or harder to obtain these unhealthy products. The impact of each factor varies according to the various circumstances in and between urban places. Indeed all these factors combine in various proportions to make significant place differences in the health of income groups and areas, especially in an urban context, between and within diverse towns and cities. The intra-urban differences often lead to an ecology of health disadvantage within cities, creating what has been called an urban health penalty (Greenberg 1991). These unhealthy areas are characterised by limited health care facilities, people with poor life-style choices, substandard residences and often polluted conditions, low incomes, few economic prospects and a stressful, unsafe and socially isolated life. These conditions are made worse by feelings of hopelessness and the negative biological embedding of the harsh social environment. Increasing concern about the persistence of these pockets of ill-health has been one of the reasons why a new approach to the health care structure of many cities was developed. Towards the end of the twentieth century, the challenges associated with improving the health of the population led many to the realization that it was not appropriate to depend only on treating our way out of ill-health through the work of the medical profession alone. New forms of organization and new initiatives were also needed, as well as a wider ecological view. This led the World Health Organization (WHO) to create the Healthy Cities programme in 1986 (Kenzer 1999). To some extent this was a broader version of the Ottawa Charter of 1986, which had established criteria for making Canadian cities healthier. Although initially restricted to the developed world, the Healthy Cities programme spread to cities in all parts of the world from 1994 and by the end of 2013 comprised a network of over a thousand cities worldwide, with both continental organizations as well as national ones, such as the 29 cities that make up the U.K. network.
The most successful Healthy Cities programmes maintain momentum through five values: a clear vision to promote health; the commitment of local community members; the ownership of policies; a wide array of stakeholders; and a process for institutionalizing the programme (WHO 1998). It is left to adopters to organize their own structures, on the basis that governance varies widely between countries and locals know their own needs and priorities. The risk of this autonomy is that some cities lack the resources to develop good guidance and policies. To counteract this, membership of the network allows cities to learn from one another and to use 'best practice' ideas pioneered by other centres. The influence of this programme has also led many non-members to review their own approaches to improving health care, in which six major new trends are prominent, namely: the creation of health impact assessments; new political engagement; raising public awareness; adding community engagement; improving private-sector participation; and, finally, more specific spatial targeting to improve conditions in persistent areas of ill-health. One way in which local governments have become more engaged in health issues can be seen in initiatives that focus on the bureaucratic deficiencies of particular departments with respect to health. For example, planning decisions have rarely taken public health issues into account in reviewing and approving development applications, except in zoning decisions and in ensuring that such utilities as roads, sewers and water are provided. However, towards the end of the twentieth century new interests emerged that stressed the need to assess the health effects of public projects on the residents of cities in particular. In part this may be seen as an extension of the increasing use of Environmental Impact Assessments. These were designed to investigate the physical and biotic impacts of new developments and were given statutory approval in many countries as relevant issues to be considered in decisions made by government planning authorities. Initial support for the Health Impact Assessment (HIA) concept came from the WHO's concerns over the effectiveness of new sanitation projects in developing countries, which led the organization to produce a review of procedures for analysing the effects of new developments on health (Birley 1995). A more general impetus came from the gradual realization that 'health' cannot be considered in isolation. In some regions it has become a vital consideration in all types of development, especially in terms of changes planned for the urban environment. This can be seen in Article 152 of the 1997 Amsterdam E.U. Treaty, which states: A high level of human health protection shall be ensured in the definition and implementation of all Community policies and activities. In Europe the specific concept of the Health Impact Assessment (HIA) emerged from worries that public development proposals were not being analysed for health impacts (Scott-Samuel 1996). For example, concerns about noise and other health impacts of Manchester airport's second runway proposal in the 1990s led to guidelines for health assessments. Soon after, the U.K. government committed itself to using HIAs as an important strategy in development reviews, as part of the need to identify and reduce health inequalities (Scott-Samuel 1998).
The so-called Gothenburg Consensus that emerged after a meeting of the WHO in the Swedish city in 1999 attempted to bring some order into the evaluation process by identifying key features that were essential in the development of a HIA (WHO 1999). In addition to the general desire to maximize health outcomes, it stressed the need for five values: democracy, so that all people can participate and help formulate policies; equity, to involve people from all walks of life; sustainable development in the long and short term; the ethical and transparent use of evidence, which recognizes that qualitative and non-scientific evidence can also be important in assessing impacts; and a comprehensive approach to health that considers the physical, mental and social well-being of all sectors of society and their involvement. Many of these values mirror the arguments used by proponents of a Just City approach to development within cities (Chap. 3), as well as those seeking to create greener, more sustainable places (Chaps. 4-6). It was also agreed in Gothenburg that a HIA should include four major elements: reviews of the evidence showing the effect of any new development policy or programme on the health of a population; consideration of the opinions, expectations and experience of those affected; recognition of the need for a more informed understanding, by both decision-makers and the public, of the effects of the proposed policies; and proposals for adjustments or other options to maximize the positive effects of any policy and to minimize its negative consequences. These elements illustrate the seriousness of the desire to take public opinion into account and to consider views from the various stakeholders.

A review of 88 HIAs from 1996 to 2004 (Davenport et al. 2005) revealed that not all were successful. It concluded that HIAs influenced policy only when certain conditions applied, namely: when there was institutional support for solving health problems; when key decision-makers were involved in the design of the analyses; and when there was a statutory framework within government procedures for using a HIA. Without these conditions, the various bureaucratic evaluation procedures simply avoided the issues. Yet HIAs are now becoming as familiar as environmental reviews in some countries.

A good example of the way in which health concerns raised by a community obtained crucial institutional support from a city's Public Health Department (PHD) can be seen in the Trinity Plaza proposal in inner-city San Francisco, although it was not technically a HIA. In 2003 the planning department supported a proposal for the redevelopment of the Trinity Plaza area near downtown, which involved plans to convert 360 rent-controlled housing units into 1700 market-rate units in several high-rise towers. It ignored the fact that the development was one of the latest gentrification projects, involving a major displacement of low-income people who had lived in the area for decades (Corburn 2009). The project led to a great deal of community anger and to the formation of a pressure group to oppose the proposals. The Mission Anti-Displacement Coalition (MAC) created what it called a People's Plan to retain affordable family housing, preserve local jobs and secure community oversight of land use changes.
Fortunately for the community's case, the Public Health Department in the city had moved away from dealing only with its traditional biomedical and sanitation focus and had started to address the social determinants of health in the city, setting up workshops to learn about community concerns and providing advice about ways of improving areas. This led the PHD to intervene in the Trinity Plaza project on the side of the community, supporting its view that the displacement of the existing residents would cause financial hardship, given higher rents in other locations, as well as the loss of community relationships and stress for the individuals who would be evicted, leading to increased levels of ill-health from these social determinants. The gradual acceptance of the health consequences of the project eventually led to discussions between the PHD, developers, planners and community activists, which produced a revised proposal in which 12 % of the project would be allocated to below market-rent housing, with existing occupants keeping their homes at current rates. In Corburn's (2009, p. 130) succinct phrase: "the Department of Public Health had entered the world of urban development and staked-out a new space for human health". Although there are still problems about the continued gentrification of the Mission area, this San Francisco example provides a useful exemplar of the way in which health issues are increasingly seen as relevant in planning reviews of new developments, irrespective of whether there is a formal HIA process in the city.

The need for greater political engagement is another major principle of the WHO's Healthy City movement, in the hope that it will lead to the allocation of more finance and resources to improve health promotion and care. This new emphasis on local political involvement in health care, instead of only medical decision-making, has often taken time to achieve in many cities, frequently with limited initial results. For example, Liverpool in England moved from setting up a Healthy City Committee in 1988, to a Healthy City Team (HCT) in 1994, which produced a new plan for Liverpool in co-operation with the local Primary Care Trust (Otgaar et al. 2011). This Trust was one of the 153 similar bodies under the U.K.'s National Health Service (NHS) that form the first level of health care via family doctors, dentists and so on. Unlike the initial committee, the team had the skilled personnel to develop not only a vision statement but a plan for achieving their goals, and consulted widely with citizens, medical professionals and various other community groups. However, its plan was not simply about better health care and health promotion; it also related to how these would help to achieve greater business prosperity, social justice and environmental protection in the city. Despite its progress the HCT was wound up in 2000, in part because it seemed to need the greater involvement of the traditional medical professions. It was replaced by a new partnership organization, called Liverpool First, to plan and monitor health care in the city, which is now the joint responsibility of the Liverpool municipal council and the local Primary Care Trust. This organization has a wider range of stakeholders, ensuring input from many local organizations, from community groups and private businesses to various public sector services, thereby creating a more coordinated approach involving many people.
This development in Liverpool has been duplicated elsewhere, which means that the planning of health care in Britain is no longer the domain of the medical profession alone in those urban areas that have adopted this model. The local authority connection provides a more democratic basis, while the involvement of many other agencies and groups provides additional ideas and skills, making for a more comprehensive approach to improving the health care of residents in cities.

New members of the Healthy City network have to create a Health Profile to show their situation in comparison with other centres. This creates greater public awareness of the level of health in particular cities and frequently leads those with poor profiles to take steps to improve their record. For example, one of the reasons why Liverpool joined the Healthy City programme was its recognition that it had one of the worst health records in the U.K.: life expectancy at birth was 74 years, compared to 79 years in Edinburgh and 78 years in London. Only Glasgow, with a life expectancy of 71 years, had a worse record among major British cities. Also, over half of Liverpool's population lived in the bottom 10 % of deprived British areas, places classified by measures of multiple socio-economic deprivation, while a quarter of the population had long-term debilitating illnesses (LPCT 2007). Part of the problem stemmed from Liverpool's decline since the 1930s, when its population was over 800,000, and from the poor health habits of many in the population. This poor health profile was not only a problem for its people but also created a negative image, which had the effect of repelling rather than attracting innovative migrants or businesses. So health promotion and participation campaigns were initiated which involved the general public as well as a range of health care and social service professionals. In some ways this decision to attract public attention is another version of the comments made earlier in the chapter about the way that the Health of Towns Association in Britain in the mid nineteenth century encouraged discussion of the mortality crises in various towns, which often led to major local improvements.

The so-called Big Health Debate, which took place in Liverpool in 2006-2007, provides a good example of this process. It began with surveys to find the major topics of health concern, followed by workshops involving medical professionals and representatives of the general population, then focus groups that worked on particular problems that had emerged, as well as analyses of the use of health facilities (LPCT 2007). One of the findings was that the general public felt they lacked the power to take control over many parts of their life, especially in health care, and were critical of being unable to influence decisions that affected them. So there was clearly an unmet need to involve the public in the discussions and decisions being taken over health care. For most of the twentieth century there were few examples of effective local community engagement in urban health planning under the older top-down approaches dominated by medical professionals. Usually the so-called 'community engagement' process involved people being informed about the decisions of medical planners after the main decisions had been taken, or being invited to attend public meetings that ostensibly discussed issues of concern but were really designed to reduce resistance to change.
However, the Healthy Cities goal of stressing the need for more democracy and community participation is leading to a different approach from previous practices. In the case of British cities such as Liverpool, the new attitudes were encouraged by a national government white paper called Our Health, Our Care, Our Say (DH 2006), which outlined: "The need for a wider range of community-based services, offering patients choice, convenience, fairness and a much better experience of the NHS. The White Paper also focuses on prevention and the need for the NHS to work in partnership with local authorities, voluntary agencies and local people so we are all working to build the foundations for better health in local communities." (LPCT 2007, p. ii)

The result was a model in which more community care was envisaged, given the fact that many people are living longer, but often with debilitating illnesses that need local and home care rather than hospital stays. A new type of participation was also proposed to create more transparency and openness in discussing issues of public concern. It also anticipated capacity-building in the population, by improving their knowledge of the issues and encouraging their contribution to the debates, a process in which their opinions were carefully evaluated. This led to what was described in Liverpool as 'The Big Debates' about health care. They revealed that a major concern of many people was their limited access to health care. This is a problem found in many cities, as decisions about the locations of both doctors' offices and hospitals were usually made without effective public consultation. Indeed, at a more general level it is remarkable how many hospitals in the developed world seem to be located close to wealthy areas, or have limited rapid public transport access because they are in inaccessible locations, while physicians' offices are few and far between in poor areas, illustrating a failure of the market to serve all of the population. Few countries have adopted the Dutch model of planning, which ensures that major facilities that serve the public or have large public use, such as hospitals, have to be located on major transit routes to maximize their accessibility.

The health surveys in Liverpool showed that people were prepared to accept the emerging trend of increasing concentration of specialized medical facilities so long as basic facilities were locally available within a short travel time. A major report mapped the locations of the various health facilities and the travel distances and times to them from various points in the city. Geographically, it demonstrated that many residential areas had either few health care facilities or limited access to them. Incorporation of these results into discussions on how to solve these problems of access led to a new and comprehensive approach for health care in the region, with the subtitle the 'Outside of Hospital Strategy', which has created a four-tiered system of health care in the metropolitan area (LPCT 2007). At the local level are the General Practices, with single or multiple doctors serving 1800-18,000 patients. Above these, 20-25 Neighbourhood Health Centres are planned to serve populations of 20,000-25,000 and are located in places that can be reached by public transport within 15 min. These centres are anchored by doctors and health care professionals but also contain other social and community services.
The third level consists of NHS Treatment Centres serving 100,000-150,000 people, located so that they can be reached within 30 min by public transport. These provide primary and secondary care, such as minor surgeries, diagnostic tests, counselling and therapies, as well as outpatient clinics. The fourth level consists of the main hospitals serving 300,000-500,000 people, which carry out major surgeries and other specialised medical procedures and care, with some facilities concentrating on particular medical areas, such as children's care or mental health problems. This four-tiered system is designed to create a more integrated and community-based delivery system based on the problems of various areas, conforming to the goals of greater choice, access, fairness and convenience set out in the national white paper (DH 2006). Some of the centres also provided a new set of jobs within areas that previously had few medical facilities. The second and third levels made access to public transport a major location principle, showing the need to cater for people without cars, while the addition of these new, more local units was designed to reduce the growing pressure on the emergency wards of major hospitals.

Liverpool also showed the way in which more effective public involvement in hospital re-planning can occur, as seen in the case of Alder Hey Children's Hospital in the West Derby suburb, which treats over 200,000 patients each year, many from other Primary Care Trusts outside Liverpool (Freeman 2006). Additional new hospital buildings were built and the adjacent decaying and unsafe Springfield Park was rejuvenated, with walking trails, sports pitches and greater surveillance to improve safety, in addition to creating better connections for pedestrians and cyclists to surrounding houses and retail facilities to the south. The design and location of the new buildings, both their interior and exterior plans, were discussed in public meetings, with local citizens providing ideas and modifications of the initial plans, which led to a feeling of positive involvement with the site's redevelopment. Determined efforts were also made to ensure the project would be more sustainable, such as in building construction, energy use and creating links with local transport facilities. Moreover, instead of the hospital being simply a place of 'care and cure', it trained its staff to help patients and their parents adopt a healthier diet and life-style, providing greater health promotion in an area where premature deaths and disability due to poor life-style choices are far too high (Otgaar et al. 2009, pp. 49-51). These examples show how public engagement moved from the top-down, information-imparting style to one of genuine attempts to find out the major problems that concerned patients and the public, and to involve local residents, as well as experts in various aspects of health care, design and accessibility, in a genuinely interactive process of consultation and engagement.

In a more social context there is also evidence of increasing interest in developing local community interactions. For example, in Helsinki the city developed a series of 'neighbourhood houses' staffed by a core of city coordinators, but also by many volunteers. They organize physical and social activities, primarily but not exclusively for young people, to provide them with new opportunities, such as sports, and in doing so guide them away from anti-social activities (Lafond et al. 2003).
In some ways these 'houses', managed by the city's Health and Safe City Advisory Committee, provide a contemporary version of the older community associations found in many western Canadian cities, which arose spontaneously (Davies and Townshend 1994). In these cases, however, there was no anti-crime agenda as in Finland; most started with sports activities for children but added additional social functions for adults, or various youth clubs, and often generated funds to build their own community hall with help from city and provincial grants. Historically, of course, even older community-based social centres and youth clubs were created by churches, whose parish halls provided places for local interaction and facilities. However, the decreasing number of church members and the divisions between rival denominations in many cities have meant that their ability to act on behalf of the local area has been drastically reduced, although many still provide support for their church members or space for other groups. There can be little doubt that there is a need for more secular local organizations, such as those created in Helsinki, or the contemporary YMCA and YWCA that have evolved out of their Christian origins. All these local centres provide important places for social and recreational contact, not simply for the young, but as centres that may also alleviate the loneliness of many old people in cities.

Although the objective of these more comprehensive health initiatives has been to involve as many stakeholders as possible, the contribution of the private business sector to health has not been as widespread as it could be, although many of the schemes discussed above do have representatives from private businesses. Part of the problem is that some firms already provide extra medical benefits for their workers and feel that this is enough. Others, especially in Europe, provide some employees, usually at managerial levels, with stays in spas, presumably to recover from the stress of decision-making. Hence such firms are reluctant also to contribute to general health care schemes whose benefits seem to lie beyond their immediate interests. In some cities, however, there are signs that private firms see the advantages of participating in schemes to promote better health, especially when there are new business opportunities. This is certainly the case in Helsinki, where the promotion of new technology, a focus on health and new organizations to promote healthy living have led to many initiatives (Otgaar et al. 2011). For example, Forum Virium (FV) was established with support from major national organizations and companies to promote and create new digital services in several areas, including health care. This led to a photo and diary service for day-care centres to provide parents with images of their children's activities, a service that has subsequently spread to many countries, creating a new business venture. In addition, FV helped create the Healthy Helsinki programme that initiated several ventures to improve health through citizen empowerment, such as a Mobile Health project. Set up with help from telecom companies, it enabled participants to monitor their leisure and exercise levels, receiving positive feedback through mobile devices when they increased their activity, in an attempt to change their behaviour.
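The kind of feedback loop used in such mobile health projects can be made concrete with a small sketch. The following Python fragment is a hypothetical illustration only: the threshold, message wording and data format are assumptions made for exposition, not details of the actual Helsinki Mobile Health service.

```python
# A minimal, hypothetical sketch of the kind of feedback rule a mobile health
# project might use: compare this week's activity with the participant's own
# recent baseline and return an encouraging message when activity has risen.
# All thresholds and messages are illustrative assumptions.

from statistics import mean

def weekly_feedback(past_weeks_minutes, this_week_minutes, improvement=0.10):
    """Return a feedback message given weekly minutes of exercise.

    past_weeks_minutes: weekly activity totals (minutes) for recent weeks
    this_week_minutes:  activity total (minutes) for the current week
    improvement:        fractional rise over the baseline that triggers praise
    """
    baseline = mean(past_weeks_minutes) if past_weeks_minutes else 0
    if baseline == 0:
        return "Welcome! Every active minute you log this week counts."
    change = (this_week_minutes - baseline) / baseline
    if change >= improvement:
        return f"Well done: activity up {change:.0%} on your recent average."
    if change >= 0:
        return "You matched your recent average. Keep it up."
    return "A quieter week. A short daily walk would get you back on track."

if __name__ == "__main__":
    print(weekly_feedback([120, 150, 130], 170))  # prints a positive message
```

The essential design choice here is that feedback is keyed to improvement against the participant's own recent baseline rather than to an absolute standard, which is what allows such a service to reward behaviour change rather than pre-existing fitness.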
Elsewhere the development of computerized Personal Health Records, together with interactive mobile devices, has not only enabled users to store their own data but also provides access to training, life-style and nutrition advice, while in a neighbouring town it enabled the monitoring of children's growth, which identified abnormalities. More generally, increasing numbers of smart applications for health-related issues are being provided for smartphones and other mobile devices, such as regular personal contacts to reduce isolation. Also, organizations such as the Toronto Rehabilitation Institute (TRI) not only treat patients with long-term disabilities from injury or aging, but actively pursue commercialization of their new technical products, such as their SlingSerter and RoboNurse for lifting patients (TRI). They are also at the forefront of research in providing assistance to people with cognitive impairment, through smart alert systems that use sensing devices to remind people to turn off stoves and taps, and how and when to complete many regular routines; in addition, like the Japanese, they are building robots to help with these tasks. Such devices provide help and relief to carers, and enable people to live longer in their own homes rather than being forced into homes for the aged or hospitals, which dramatically increases the costs of care. Although there are clear business opportunities for companies to provide the hardware and software for such new monitoring systems, there is always the danger of intruding on personal privacy. Moreover, municipal bureaucrats are often wary of getting involved, since they have their own functions to perform and the public benefits of some programmes may not be easy to evaluate. Nevertheless, the new electronic technologies can be used to monitor health more effectively than in the past, and more private company involvement is not only possible but needed in Healthy City initiatives, so long as personal privacy is not compromised.

In the last decade of the twentieth century the national governments of developed countries, often stimulated by the WHO Healthy Cities programme, started to emphasize policies that compel, not simply advise, all levels of government and their agencies to help reduce health inequalities, especially in cities. For example, in London this led to the creation of a London Health Commission composed of representatives from the various boroughs and from organizations such as the NHS, transport bodies and the local development agency (LHC 2008). This commission was designed to ensure that health concerns are involved in all the tasks of the Greater London Authority and the 33 boroughs within it that have statutory authority for a wide range of services and functions. Previously, local governments and the Primary Care Trusts of the NHS did not consider the health impacts of their various planning and development projects. The change meant there was now a statutory obligation for the mayor of London to act to reduce these inequalities. The Health Commission developed a strategy called Living Well in London, which ensures that investments in housing and public spaces are made to create an environment that helps people make healthier choices, and which promotes links between the many agencies (from housing, to police, social services and organizations dealing with drug and alcohol abuse and homelessness) that may help to improve the determinants of health.
In some deprived areas new multi-functional facilities have been developed, such as The Hub in Canning Town, a large building that houses a health centre with doctors and nurses as well as a pharmacy, together with many other services indirectly linked to health and community improvement, such as a training centre, an internet café, community meeting rooms and a safety group. In other boroughs of London, local strategic partnerships have been established to improve the quality of life and health of residents. The boards of these organizations include representatives from the public sector, including the NHS, local government and police, as well as people from citizen groups and various private businesses. Although they have no executive powers, they are influential in ensuring that health considerations are given priority in the various plans and actions of the partner organizations. They also encourage the establishment of thematic groups to focus on specific health improvements, including encouraging local residents to identify and solve health-related problems.

In addition to these partnerships, a 'targeted site' approach has identified areas of extreme deprivation and health problems as places for special programmes to improve local conditions. The Well London Alliance programme created 14 projects in 20 of the most deprived parts of the city in 2007, dealing with quite small areas of 1500-2000 people (WLA 2007; Otgaar et al. 2011). These are not policies of planning or housing departments alone, but involve multiple agencies, including the YMCA, two charities, a technology transfer agency, the Arts Council, a local university and a major hospital trust, with the programmes funded by grants from the National Lottery. The various agencies contribute their own expertise to the projects. However, in an attempt to get more local input, the residents affected were consulted about the projects through site visits, action workshops and discussions in the community cafés that were established. Surveys revealed a lack of co-ordination among the existing social services and showed that the people in the areas were not simply deprived economically, but lacked the hope and the ability to improve their conditions, typical problems that affect the development of grass-roots organizations (Davies and Herbert 1993, Chap. 6).

An important focus of these targeted Well London programmes has been to emphasize better nutrition, create exercise opportunities, and develop a series of other programmes to address the various negative social determinants of health, through a series of so-called 'heart of the community' projects. These are designed to reduce loneliness and improve community relationships, for example by involving young people in the design of projects and by creating access to healthy facilities in the area, from encouraging new shops to finding safe walking routes. They also focus on reducing crime, on developing personal improvement programmes to improve the mental well-being of people in the area, and on trying to create a sense of ownership over local facilities and projects. By having control group areas where no intervention occurred, the effectiveness of the various initiatives was measured over the 5-year project period from 2007 in order to create evidence-based policy evaluation, in the hope that programmes that work can be copied successfully in other areas.
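The logic of such control-area comparisons can be illustrated with a minimal sketch. The Python fragment below shows how before-and-after survey scores from an intervention area and a matched control area might be combined into a simple difference-in-differences estimate; the figures, variable names and the well-being scale are hypothetical illustrations, not data or methods from the Well London evaluation itself.

```python
# A minimal difference-in-differences sketch for a targeted-area health programme.
# All numbers below are hypothetical; the Well London evaluation used its own
# surveys, indicators and statistical methods.

def difference_in_differences(intervention_before, intervention_after,
                              control_before, control_after):
    """Estimate the programme effect as the change in the intervention area
    minus the change in the control area over the same period."""
    intervention_change = intervention_after - intervention_before
    control_change = control_after - control_before
    return intervention_change - control_change

if __name__ == "__main__":
    # Hypothetical mean well-being scores (0-100 scale) from baseline and
    # follow-up surveys in a project area and a matched control area.
    effect = difference_in_differences(
        intervention_before=54.0, intervention_after=61.0,
        control_before=55.0, control_after=58.0,
    )
    print(f"Estimated programme effect: {effect:+.1f} points")  # +4.0 points
```

Subtracting the control-area change nets out city-wide trends that would have occurred without the programme, which is why control areas matter for evidence-based evaluation of this kind.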
However, despite these targeted initiatives there is a pressing need for more social housing in London. The influx of capital from wealthy overseas investors, combined with the strength of the financial services sector in particular, means that land and house prices continue to rise, with the result that essential workers, such as nurses, social workers, police and teachers, are priced out of the market.

The type of comprehensive, multi-agency approach described above is only one of the ways in which attempts are being made to improve the health of disadvantaged people. Another is the targeting of poor people by giving income supplements on condition that their children attend school and keep their vaccinations up to date, an approach pioneered by the Brazilian Bolsa Família programme, which has reduced the poverty level of over 44 million people. More specific health care access for the poorest in American cities can be seen in the addition of new community clinics, either free-standing or mobile. They provide health care in disadvantaged areas where there are few local physicians or pharmacies, creating centres that offer a range of services, such as physicians and dentists, as well as counselling and health promotion units. In impoverished parts of American cities, school health education and provision is an increasing trend, which often includes the addition of clinics aimed especially at adolescents and their problems (Fitzpatrick and LaGory 2011). A National Assembly on School-Based Health Care acts as a co-ordinating and advice centre for these school-based clinics, which have grown from 20 in 1980 to over 2000 today in 44 states. Research has shown that schools with clinics have fewer disciplinary problems and course failures, lower absentee rates and better health (Smith 2013).

There is also a long history of particular individuals acting as catalysts for change in particular deprived areas. For example, a Venezuelan physician, America Bracho, was influential in establishing Latino Health Access, a non-profit organization in the poor, mainly Spanish-speaking city of Santa Ana, an area of a third of a million people in the Los Angeles region and one of the toughest, most crime-ridden, deprived and unhealthy areas in the country (LHA 2009). Churches, too, have long had a role in trying to cater for the poor and the disabled. In recent years more active approaches have developed, such as the Communities of Shalom (CS) set up by the United Methodist Church. Initiated by the Rev. Joseph Sprague in 1992, it has since created hundreds of 'shalom zones' in impoverished areas, training local residents to promote health information and social capital. In many American cities some churches have partnered with the public health departments of local universities to create facilities and programmes to improve the health of people in poor neighbourhoods, using trained local volunteers and specialists, such as the Congregations for Public Health in Birmingham, Alabama (CPH). All these examples represent contemporary equivalents of the settlement houses of nineteenth century cities, designed to improve the educational, health and economic prospects of people in impoverished areas, although those were mainly secular, as seen in the work of Hull House in Chicago and many other cities (Adams 1910).
Another policy has tried to improve areas of high disadvantage and ill-health by seeking to reduce concentrations of poor families, for example by scattering public housing rather than clustering it, and by moving families to other locations where there are better facilities and more social mixing (Rubinovitz and Rosenbaum 2000; Comey et al. 2008). Although there have been some successful resettlement schemes, the scale of the areas of impoverishment means that this cannot be a solution for all areas.

Space constraints mean that the examples discussed have focused only on developed countries, to show that even where there are extensive health care programmes there is room for major improvements in targeting the most unhealthy places. A far bigger problem exists in the developing world, where at least a third of the population, over 860 million people in total, live in informal shanty towns. Many of these places desperately need updated and modern versions of the types of services, from clean water supply to sewage and other utilities, that were created in the European urban improvements of the nineteenth century, deficiencies which lead to poor hygiene and debilitating illnesses. These informal settlements are characterized by high levels of deprivation due to poverty, with flimsy informal housing whose rents often take up a third of non-food expenditure, unemployment, limited skills or education, low environmental quality, often high crime, and many residents, especially the elderly, who have few social contacts and high levels of loneliness. Many of these areas also lack proper property rights. The situation is often made worse by toxic outputs from nearby factories that ignore environmental regulations even where these exist. It is often argued that these slums, not all populated initially by rural migrants, should be welcomed because, unlike rural areas, they provide the opportunity for improvement. Some have shown improvement, but the idea of the slum as a temporary way-station on the road to prosperity is increasingly being challenged. A large survey of slum dwellers (Marx et al. 2013) has confirmed that many people get trapped in these areas, leading to generational poverty and illness, despite the opportunities that enable some to escape.

Even in the cities of middle-income countries there are still many slum areas; for example, in Buenos Aires about 10 % of the population lives in the 56 shanty towns called villas miserias. Most of them lack schools, medical facilities and basic utilities, have poor drainage, so that heavy rain causes flooding, and suffer from limited employment and higher levels of ill-health. In 2011 the city created the Secretariat of Habitat and Inclusion to coordinate efforts to incorporate these areas into the city and upgrade their facilities, but its efforts are handicapped by a small budget. One of its schemes is a so-called urban acupuncture programme that adds sports facilities, plazas and community centres to the deprived areas to act as focal and service points. However, the programme has not been as effective as promised, because the Secretariat is underfunded and is seen as a top-down agency that often ignores the grass-roots organizations that have developed in these areas, which emphasize that the real need is also for more adequate social housing and basic services, including health care.
Elsewhere attempts are being made to provide more community facilities and to connect slum areas, which are often isolated on hillsides with few roads, with other parts of the city, such as by the gondola lifts in Bogota. In many other slum areas, such as in Rio de Janeiro, the problem of safety is paramount because of their control by local gangs. This is being tackled by what amounts to a blanket invasion by security services, followed up by the stimulation of grass-roots organizations, the provision of local community and school facilities, and a permanent police presence to guarantee safety. Despite occasional attempts to improve the condition of people in these informal settlements there is still an enormous amount of work to do, given the scale of these problems and the limited funds and trained personnel of the city administrations. Indeed, at base, the older developed-world solutions to create healthier man-made environments are still needed. Although modern medicine, especially for infectious diseases, has kept mortality rates low so far, the constant exposure to unhealthy conditions and stress in these slums, combined with the new growth of antimicrobial resistance, means that these areas could be the breeding grounds for future epidemics. This fear should speed up efforts to improve these often squalid environments.

Improving the health of people in cities is part of their basic needs and a matter of social justice. It is one of the most important of the new policies contributing to liveable urban places. The old urban health disadvantages were solved by nineteenth century engineering and regulatory approaches, and by the enormous advances in medical knowledge and care. Yet too many urban places still need basic sanitation, safe water supplies and adequate numbers of health care professionals and facilities. In addition, new concerns have arisen about gaps in the delivery of health care, even in the developed world, with a need to address the societal structural conditions that lead to high inequalities and poor health. The new Healthy City programmes are creating positive changes through greater political involvement and community engagement, with policies designed to improve access to health care, such as spatial targeting and the use of Health Impact Assessments. The concept of 'wellness' stresses the need for active health promotion and research into the impact of the many health determinants, especially the often downplayed environmental, economic and social domains.

Although these new approaches are undoubtedly improving the health of people in many cities, they are not enough. It has been shown that urban living creates greater risks from such problems as traffic and pollution, but also from less time spent outdoors. One problem is that pregnant mothers with limited exposure to sunlight may be inhibiting the development of their foetuses, which seems to account for some of the recent increases in many autoimmune diseases. Another comes from an enormous recent rise in levels of myopia, particularly among young urban Chinese and other Asian children, with 70 % of children affected in some areas. Studies have discounted the general effect of genetic factors and shown that the children affected are not spending enough time outdoors, with too much time spent studying indoors and using electronic devices.
Control groups of Australian children and rural Chinese children did not have this problem, which seems to be a consequence of a lengthening of the eyeball along its main axis due to a lack of exposure to sunlight (Lougheed 2014; Rose and Saw 2013). Increasing challenges also come from the failure of the major antibiotic drugs that counter many diseases, which could reverse many medical advances. There are also increased risks from new communicable diseases in densely populated and interconnected urban places, as shown by the case of HIV. A more recent example is the rapid spread of Ebola in West Africa, with over 5,000 deaths and at least 14,000 cases by the end of October 2014. As René Dubos (1959) observed in many books, nature in the form of some microbe will always strike back at human bodies in some unpredictable way, for new viruses constantly emerge and existing ones mutate and become resistant to drugs. Ebola's rapid spread was due to its ability to mutate quickly and suppress human immune systems. But it was also a consequence of several environmental and cultural factors: the failure to recognize its initial appearance, probably a cross-species infection from virus-carrying fruit bats; its spread in areas of high density, poor sanitation and high mobility, with health care systems that were very limited in personnel and facilities; a lack of trust in authorities after years of war; and traditional healing and funerary practices that involved a great deal of touching, which increased transmission because bodies are especially infectious through the body fluids that spread the virus. The heroic efforts of the charity Médecins Sans Frontières helped many people, but it did not have the resources to contain an outbreak in which the death rate among infected people was often 70 %. Unfortunately it took over six months for the WHO and other countries to begin a massive effort to contain the disease by providing supplies of soap, alcohol swabs and disinfectants to help reduce Ebola's spread; thousands of trained doctors and nurses to attend the sick; enough protective clothing; and isolation hospitals to treat patients; as well as by tracing the contacts of infected people, a task impeded by the absence of detailed local maps.

This outbreak showed we cannot be complacent about the threat of new diseases. There is a pressing need to have sufficient facilities available to be sent quickly to areas where new diseases are detected, and a necessity to boost research to find vaccines and drugs to combat such new diseases and older ones, like malaria, that are still endemic. Many of these problems go beyond the ability of cities alone to solve, although their density and connections make them more vulnerable, especially those with poor sanitation. Cities rarely have the power or finance to improve health care on their own, but they can initiate new programmes to improve their facilities. Yet their very size means that their politicians and citizen groups should be able to mount successful lobbying of national governments to take the more active, comprehensive and multifaceted measures needed to improve health in urban settlements.