title: Dual Use of Biotechnology
authors: Baillie, L.; Dyson, H.; Simpson, A.
date: 2012-01-03
journal: Encyclopedia of Applied Ethics
doi: 10.1016/b978-0-12-373932-2.00430-0

Disclaimer: Any views expressed are those of the authors and do not necessarily represent those of Dstl, the Ministry of Defence, or any other UK government department.

This article addresses issues that are central to the dual use of biotechnology, such as the public perception of risk and the need for physical containment to prevent the release of potentially dangerous microorganisms. It also examines the public and media perception of the scientists who handle and manipulate these pathogens and discusses the controls that are currently in place to ensure that scientists engaged in defense-related dual-use medical research act in a transparent and ethical manner. Finally, the article discusses what can be done by scientists to allay the fears of their fellow citizens.

Research in the area of life sciences and biotechnology has the potential to bring great benefit to humankind. In a relatively short period of time, the life sciences have evolved from a simple cataloguing exercise of the diversity of nature to a position in which researchers are adding to that diversity through the construction of modified and potentially novel life-forms. The vast majority of this activity has had a positive impact on the quality of life of at least some of the human race. Indeed, the past 150 years have seen major advances in the fields of microbiology and biochemistry, and these have been followed by the emergence of the disciplines of immunology, molecular biology, and genetics. In practical terms, this has resulted in the introduction of sewers and clean water, the development of antibiotics and vaccines capable of eliminating infectious diseases such as smallpox, and the ability to create genetically modified organisms able to synthesize production-scale quantities of human hormones such as insulin. Indeed, on a daily basis biomedical researchers manipulate microorganisms in an effort to understand how they produce disease and to develop better preventative and therapeutic measures against the infections they cause. The efforts of plant and animal biologists using similar techniques to improve agricultural yields have resulted in the development of disease-resistant crops and transgenic animals. Some of these species have transitioned from the confines of the laboratory into mainstream agriculture in countries such as the United States and India. On first inspection, these emerging technologies hold enormous potential to improve public health and agriculture, strengthen national economies, and close the development gap between resource-rich and resource-poor countries. However, there is also a potential dark side to this benign picture. Throughout human history, every major new technology has been used for hostile purposes, and thus it would be naive to believe that the life sciences might not be similarly exploited for destructive purposes by state-sponsored biological warfare programs or by individual terrorist or doomsday groups. Research with the potential to be misused for illicit purposes is said to be 'dual use.' Simply stated, the techniques needed to engineer a bioweapon are the same as those needed to pursue legitimate research.
There are also concerns that rapidly advancing technological possibilities could enable the creation and production of unforeseen new biological threats with uniquely dangerous but unpredictable characteristics. A key challenge faced by regulatory authorities is the need to balance legitimate public concerns over the misuse of the life sciences against the enormous potential that they have to benefit humankind. Getting this balance right will be central to ensuring that governmental actions do not impose blanket restrictions and cumbersome rules on scientists that stifle legitimate research and reduce industrial competitiveness while having little impact on real security. It could be argued that any new regulations specific to the dual use of biological technologies would be largely ineffective because they would only affect scientists working in government-funded laboratories, who already follow very stringent rules. Indeed, even if new regulations were implemented, it is debatable how effective they would be. The anthrax attacks in 2001 in the United States are thought to have been undertaken by a 'regulated' lone U.S. government scientist working in a government-controlled facility. Does this mean that we need new regulations, or does it suggest that regulations alone are likely to be ineffective? It is also a fallacy to believe that life science research is limited to government-regulated facilities; indeed, the technology has reached a stage at which an individual with a graduate-level education, access to the Internet, and a credit card can set up a garage laboratory anywhere in the world. The emergence of organizations such as DIYbio is a testament to this new movement. This spontaneously formed community of more than 2000 individuals is in the process of establishing community laboratory spaces in major cities throughout Europe and the United States to enable their members to carry out their own 'hobby research.' How can these free spirits be assisted or regulated to ensure both their own safety and that of the communities in which they live and experiment?

An approach proposed by a number of advocates has been to encourage life scientists to take the lead in tackling the issue of dual-use technology. Indeed, some have stated that these scientists have a moral obligation to prevent the misapplication of their research because they are believed to be in the best position to understand the potential for misuse. The validity of this argument is debatable; it is extremely unlikely that the average research scientist has more than a hazy comprehension of the factors important in developing an effective bioweapon. This view does, however, point to the need for life scientists to move more to center stage and proactively engage with both the public and the security and regulatory communities to ensure that the control systems that are ultimately adopted are both proportionate and relevant in the real world. It should not be forgotten that these control measures stem from a desire to protect the well-being of the general public. Although members of the public are highly unlikely to understand the intricacies of the research, it is important that they support the outcome that the researchers are trying to achieve. Indeed, the support and tacit consent of the general public and their elected representatives are essential in the development of proportionate regulatory systems.
Unfortunately, scientists in general, and particularly those engaged in defense- and industry-funded research, have a poor track record in communicating the importance of their research to fellow citizens. This is primarily due to the constraints imposed on them by their parent organizations, but it also flows from a lack of understanding of science among the media industry generally, and particularly the popular press, which often results in incomplete and inaccurate reporting. As a consequence, this perceived lack of openness has created an atmosphere of suspicion in which conspiracy theorists, the media, and Hollywood thrive, routinely conjuring up lurid images of evil scientists working on government-funded Frankenstein projects to destroy the world. It is thus perhaps not surprising that public perception of scientists and their motives may not be as positive as it once was. This article addresses issues that are central to this theme, such as the public perception of risk and the need for physical containment to prevent the release of potentially dangerous microorganisms. It also examines the public and media perception of the scientists who handle and manipulate these pathogens and discusses the controls that are currently in place to ensure that scientists engaged in defense-related dual-use medical research act in a transparent and ethical manner. Finally, this article discusses what can be done by scientists to allay the fears of their fellow citizens.

Although microorganisms capable of causing disease are widespread in the environment, medical, technological, and economic advances have, to a large extent, shielded individuals in the developed world from their adverse effects. Notable examples include the reduction in the incidence of (1) puerperal fever and surgical sepsis in the nineteenth century following understanding of the modes of transmission of bacterial infection, (2) enteric fevers due to improvements in sanitation, and (3) food poisoning due to better education and food preparation practices. Although there are still intermittent outbreaks of food poisoning in the United Kingdom, the real concern is that the overuse of powerful antibiotics by the farming industry to promote animal growth could result in the emergence of multidrug-resistant bacteria, making the infections they cause increasingly difficult to treat.

The importance of a society's organizational and technological status in mitigating the effects of disease is well demonstrated by the contrasting fortunes of New Zealand and Haiti following earthquakes in 2010. Although the earthquake that hit South Island at 04:35 local time on September 4 was of a similar magnitude (7.1 compared to 7.0) to the one that struck Haiti at 16:53 on January 12, the outcomes for the two populations have been remarkably different. There were no fatalities in New Zealand (only two people were admitted to hospital in the immediate aftermath), and there have been no epidemics, despite disruption of sewage and water supply systems. This contrasts with a high initial death toll (230 000), many injuries (300 000), catastrophic disruption of Haitian society (1 million people made homeless), and an epidemic of cholera in the displaced population. Aspects of a society that determine its resilience to major disasters include the general health and education of its population, its technological infrastructure, the state of readiness of its societal organization to respond to the event concerned, and its political and governance structures.
New Zealand and Haiti appear to be at opposite ends of the spectrum for all of these criteria. A particular infection of a certain severity may have widely different impacts on an individual depending on the person's general health and specific circumstances. Thus, an enteric infection in an undernourished child in the medical center of a refugee camp outside Port-au-Prince could well prove fatal (particularly because the child is likely to be only one of many), whereas a similar infection in a healthy child in Christchurch might be overcome with little more than good nursing care from the child's parents at home. Perception of the risks of such infection also varies considerably according to a society's recent experience; childhood deaths from enteric infections are an accepted fact of life in many poor areas of the world, whereas in richer, technologically advanced areas they are not. Typically, as the prevalence of infectious disease decreases over time in a society, concern regarding rare and particularly novel infections increases. Thus, frequent but relatively mild infections (e.g., the common cold) may inflict a significant overall burden on a society in terms of general ill-health, use of health services, and loss of economic activity without arousing much outcry from the general population. In contrast, a rare but severe treatable infection such as methicillin-resistant Staphylococcus aureus may cause much consternation in the media but actually inflict a much smaller overall burden on society. Novel or emerging infections (e.g., severe acute respiratory syndrome (SARS) and swine flu) may have major economic and societal impacts worldwide, with an actual disease burden that is a minute fraction of that caused by well-known diseases such as malaria and tuberculosis. Perhaps counterintuitively, the fear engendered by rare diseases in a society appears to be inversely related to the actual disease burden that they impose. This fear will clearly be modified by experience of the disease; thus, SARS was rightly to be feared and swine flu less so. By extension, an unknown infection can cause disproportionate fear in a population, exaggeration by the media, and the risk of overreaction by the authorities. The 2001 anthrax letters episode in the United States was an example of a major response to a relatively small overall disease burden (22 people infected and 5 deaths in a country of 311 million people), but it demonstrates the fact that a society's response to an unexpected human-originated event may have a much greater impact on the society than the event itself. Media-dominated, Internet-connected, technologically advanced, economically developed areas of the world are therefore more prone to exaggerated responses based on fear of a horrifying unknown than are those areas that are less privileged. The controversies in Europe regarding genetically modified (GM) crops (feared in Europe for ideological reasons but welcomed by more pragmatic societies in India and the United States for the increased yields they bring) highlight the fact that advanced societies may, for cultural reasons, have different views of the risks associated with certain technologies. Although all advanced societies would be expected to have a marked fear of the sequelae of a deliberate release of infectious organisms, those with an already heightened fear of biotechnology might be more prone to extreme reactions.
Deliberate release of harmful biological material would provoke a number of emotions, including fear of the unknown, the ancient fear of plague or contagion, anger and fear of malicious human action, anger directed at law enforcement agencies for failing to prevent the event, and anger at politicians for possibly provoking the event. A release of dangerous biological material from a research laboratory would provoke many similar emotions, although anger would be directed more at the incompetence of those operating the laboratory and at the relevant authorities for failing to prevent it. So what is the real risk of an accidental or deliberate release of dangerous biological material in the United Kingdom? Fortunately, escape of infectious material from laboratories is very rare; examples include a smallpox outbreak in Birmingham in 1978 and the foot-and-mouth disease outbreak associated with faulty drainage at the animal health facility at Pirbright in 2007. The root cause of such accidental releases was a breakdown in containment (the physical control measures put in place to prevent microorganisms escaping to the environment). In fact, containment technology and practices have improved dramatically during the past 50 years, with significant improvements often being identified by analysis of accidents or near-misses. At the Porton Down site, which houses both Ministry of Defence and Department of Health microbiological containment laboratories, there have been only two cases of laboratory-acquired infection; these occurred in the 1960s, and both were the basis for considerable improvements in procedures. In the United Kingdom, there have been no known deliberate releases of biological material. There have been deliberate releases of infectious material in other countries; the rarity of such events has probably contributed to their celebrity status.

Given the extreme rarity of such events, why are they so feared? It is instructive to compare them with the annual incidence of certain other commonly accepted events, using headline statistics relating to work-related ill-health and accidents in the United Kingdom during 2009-10 (a rough arithmetic check of how these counts relate to the quoted rates is sketched below):

• Ill-health: 1.3 million people who worked during this period were suffering from an illness (long-standing as well as new cases) that they believed to be caused or made worse by their current or past work. A total of 555 000 of these were new conditions that started during the year. An additional 0.8 million former workers (who had last worked more than 12 months previously) were suffering from an illness caused or made worse by their past work. A total of 2249 people died from mesothelioma in 2008, and thousands more died from other occupational cancers and diseases.

• Injuries: 152 workers were killed at work, a rate of 0.5 fatalities per 100 000 workers. A total of 121 430 other injuries to employees were reported, a rate of 473 per 100 000 employees. A total of 233 000 reportable injuries occurred, according to the Labour Force Survey, a rate of 840 per 100 000 workers.

• Working days lost: 28.5 million days were lost overall (1.2 days per worker), 23.4 million due to work-related ill-health and 5.1 million due to workplace injury.

Thus, real risk is very different from perceived threat, which may sometimes appear greater the rarer the event (and hence the lower the probability of actually experiencing that event).
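The quoted rates are simple arithmetic: an annual event count divided by the relevant population, scaled to 100 000. As a rough illustration, the following minimal Python sketch back-calculates the workforce sizes implied by the headline figures above; these implied denominators are our own estimates for illustration only, not figures taken from the original reports.

    # Minimal sketch: relating the headline counts above to rates per 100 000.
    # The implied workforce sizes are back-calculated for illustration only;
    # they are assumptions, not figures from the original reports.

    def rate_per_100k(events, population):
        """Express an annual event count as a rate per 100 000 people."""
        return events / population * 100_000

    def implied_population(events, rate):
        """Back-calculate the population size implied by a count and a rate."""
        return events / rate * 100_000

    # 152 workplace fatalities at 0.5 per 100 000 implies a workforce of
    # roughly 30.4 million.
    print(round(implied_population(152, 0.5)))        # 30400000

    # 121 430 reported injuries at 473 per 100 000 implies roughly
    # 25.7 million employees.
    print(round(implied_population(121_430, 473)))    # 25672304

    # Cross-check: 233 000 survey-reported injuries across an assumed
    # workforce of 27.7 million gives the quoted rate of about 840.
    print(round(rate_per_100k(233_000, 27_700_000)))  # 841

Even the lowest of these accepted occupational rates (0.5 fatalities per 100 000 workers) stands in stark contrast to the recorded death toll from bioterrorism in the United Kingdom, which, as noted below, was zero.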
Data for England and Wales for 1989 indicated that the more common avoidable causes of death (e.g., cardiovascular disease due to smoking and obesity) carried a risk of 1 in 190, compared to a risk of 1 in 700 000 for spectacular events, such as railway accidents, that generally attract media attention; in other words, the mundane risk was roughly 3700 times greater (700 000/190 ≈ 3700). These risks were calculated retrospectively from the reported causes of death during that year. The risk of dying in England and Wales from infection due to bioterrorism in that year was zero (as it was in 2009). However, when looking forward into an uncertain future, many more factors than likelihood affect the perception of threat, and it may be that the very rarity of an event adds to its perceived impact, making it more interesting to society at large and therefore much discussed in the media. It is clearly the case that if these rare events remained unreported, the public would not dread them so much, but such censorship would not be acceptable in a democratic society, and the media should take a responsible approach to explaining real risks and suggesting appropriate and proportionate precautions to mitigate them.

Although past experience suggests that release of dangerous biological material from facilities, whether accidental or deliberate, is extremely rare, it is important that we consider how such an event could occur in the future. The most likely routes of escape are following an accident in a laboratory (hospital, academic, government, or commercial research) or as a consequence of defective physical containment processes or equipment, as occurred at Pirbright in 2007 when foot-and-mouth virus was released to the outside world.

The mainstay of preventing release of dangerous biological material rests on the principles of biosafety, biocontainment, and biosecurity. Biosafety covers the procedures needed to work safely with hazardous organisms. Biocontainment comprises the measures (facilities, equipment, and apparatus) within which work on these organisms can be carried out safely, without danger of release into the environment. Biosecurity is the process of ensuring that the whereabouts of hazardous organisms are known and tracked and that access to them is restricted to appropriately authorized personnel. These principles actually apply more widely to other human activities, including hygienic preparation of food, supply of clean drinking water, safe processing of sewage, sterile procedures in surgery, and safe operation of hospital microbiology laboratories, as well as the more obvious setting of microbiological research laboratories. Well-designed facilities and procedures both facilitate the conduct of good science and minimize the opportunities for accidents and misuse.

Unfortunately, although good engineering can reduce the physical risk of pathogen release, it cannot stop a researcher from deliberately removing material for his or her own use. The motives for such an action could include ideology (extremist apocalyptic, Islamist, or animal rights philosophies), blackmail by members of an extremist group, disorders of perception (mental illness or a desire for revenge against society following some real or imagined disadvantage), or severe disaffection with employers or colleagues. The 2001 U.S. anthrax mail attacks represent just such a case, in which anthrax spores alleged to have been deliberately removed from a U.S. government defense research facility by a government scientist were used to carry out indiscriminate attacks against the general population.
Although such events are extremely rare, this one is likely to have had a major impact on the public perception of scientists engaged in defense-related research and of their motivation. Scientists have an image problem. The charming and charismatic scientist is not an image that permeates popular culture. Although it is common for the entertainment industry (and the news media should be included in this category) to portray professions such as medicine, law, and journalism as exciting and glamorous, scientists are often depicted as unattractive, reclusive, socially inept white men or foreigners working in dull, unglamorous careers on projects that could destroy the world. Indeed, there is evidence that this impression may be imprinted in childhood and, once established, is difficult to modify. The reasons for this stereotyping are complex but can be broken down into two main areas: a failure on the part of the public to grasp the nature of the scientific process (education) and a failure on the part of scientists to present their message in an accessible manner (communication). These tendencies are compounded by an understandable desire on the part of the entertainment industry to produce content that is popular and profitable.

The issue of how our children are taught basic science is an area of obvious concern, as highlighted by the observation that approximately 70% of adult Americans do not understand the scientific process and have to depend on others to help them understand the significance and consequences of scientific advances. In the advanced economies, the major source of information is television, whereas the Internet (another unregulated environment) is increasingly used to research specific scientific issues. Given the importance of these media in 'educating' and shaping public opinion, how good are scientists at ensuring that their message is getting across? It is safe to say that whatever they are doing, it is not having the desired effect. Part of this failure is due to the inability or reluctance of practicing scientists to engage with the media in such a way as to convey their story in a form that is understandable by their fellow citizens. A survey commissioned and funded by the Wellcome Trust found that the majority of scientists believed that the public saw them as detached, poor at public relations, secretive, and uncommunicative. Furthermore, they identified a lack of knowledge of and/or interest in science within the general public as a major barrier to communicating concepts and ideas. Most of those questioned believed that they were insufficiently trained to deal with the media; more importantly, the majority of scientists surveyed distrusted the role of the mass media in communicating their results.

The role of the mainstream media and popular press is primarily to entertain their customers and make money. In that light, it is not surprising that there is a tendency to focus on stories and issues that seize public attention. All journalists know that scares make good stories and frequently generate a momentum of their own that does not require any facts to keep them moving forward. For example, in recent years we have seen the emergence of numerous scare stories in the media (flesh-eating bacteria, falling sperm counts, chlorofluorocarbons, bovine spongiform encephalopathy, harmful GM foods, etc.), many with little in the way of scientific evidence to support them.
It is perhaps not surprising, then, that a climate has been created in which the ordinary person regards scientific developments with suspicion, with the underlying assumption that he or she is being put at risk by reckless scientists operating in an uncontrolled manner in their ivory towers. This perception is not helped by Hollywood, which provides a seemingly endless diet of disaster films in which dastardly government scientists are either blowing something up or pursuing genetic experiments in a top-secret government laboratory to produce new species that could escape and destroy the world. When was the last time a blockbuster film was released in which a dedicated scientist carried out an experiment that did not end with a chisel-jawed hero saving the day? A further element that may contribute to the public distrust of science is the rise of pseudoscience, which includes topics such as astrology, alternative medicine, yogic flying, and UFOs. Indeed, it has been suggested that the entertainment industry (e.g., the popular X-Files series) is partially responsible for the large numbers of people who now believe in astrology, ESP, alien abductions, and other forms of pseudoscience that contribute to the scientific illiteracy of the public. Against such a cultural backdrop, it is not surprising that the public has little problem in believing that government scientists employed in defense-related research are not to be trusted. As a consequence, scientists, particularly those engaged in research considered dual use, find themselves in an almost impossible position when trying to explain their research and allay understandable fears harbored by the public.

So why are members of the public concerned about research sponsored by the defense community? Many nations view research into the development of medical countermeasures (MCMs) against biowarfare agents as an essential element of risk reduction. Although civilian and military populations are equally susceptible to the same biological agents, the relative risk of exposure differs markedly. Thus, although there is considerable commonality in the research priorities of each group, some biological threats, such as anthrax and plague, are currently seen as being more relevant to the military. In addition, the nature of the work undertaken by the military and the environment in which it operates are likely to influence how and when MCMs are administered. For example, the military may consider immunizing troops with a new vaccine prior to deployment as the most effective means of protecting individuals and ensuring operational effectiveness in a high-risk environment. In contrast, the civilian authorities are more likely to treat with antibiotics after an outbreak has occurred rather than vaccinate large numbers of the public against a disease, such as anthrax, for which a deliberate release has a very low likelihood (albeit a very high impact). Thus, differences in the relative risk of exposure of each target population are a major driver of the research undertaken by defense scientists. To develop MCMs capable of dealing with biothreat agents such as anthrax, there is inevitably a requirement to handle and manipulate these dangerous pathogens, which in turn generates concerns, rightly or wrongly, about the possibility of their inadvertent release or potential misuse (dual use).
Indeed, these concerns derive partly from the fact that the government-sponsored organizations currently developing defensive MCMs were engaged many decades earlier in the development of offensive biological weapons. Although this research was discontinued in the United Kingdom in the 1960s, there are still concerns, in some quarters at least, regarding the potential for this type of work to be resurrected. When the public's mistrust of politicians and scientists is added to this mix, it is not difficult to understand why people are willing to believe the worst. Indeed, the perceived lack of 'public visibility' of defense research further stimulates the public imagination, despite the fact that the results of this research are widely disseminated through peer-reviewed journals and at international conferences. The nature of modern research is such that it is rare to find a project that does not require collaboration with academic and/or industry-based partners, thus ensuring at least some degree of scientific visibility. In addition, the regular inspection of defense research facilities by national regulatory agencies or under the auspices of international treaties is an attempt to alleviate some concerns. Openness, combined with inspection by independent scrutinizers, is an important tool in tackling dual-use concerns.

If one accepts that defense-related research is warranted, then how does one justify the development of a new medical countermeasure costing millions of dollars to protect against an event that may never happen? This is particularly important given that any new MCM must first undergo clinical trials in human volunteers to demonstrate both safety and efficacy. This requires the exposure of healthy individuals to an experimental treatment that carries with it the risk of adverse reactions. Given that this is a man-made risk, how can it be justified? Fortunately, in countries such as the United Kingdom, these decisions are taken out of the hands of defense scientists. Indeed, investigators conducting clinical trials need to justify their study to an independent research ethics committee, which determines whether any potential health risks to trial participants are justified. A key element in the committee's deliberations is to determine whether there is a real-world justification for the new MCM; thus, the committee represents an important reality check. Once the study has received approval, it is subjected to further scrutiny at the national level in the United Kingdom by the Medicines and Healthcare products Regulatory Agency (MHRA). Each of these layers of control has the power to stop a clinical trial if it is concerned that an ethical breach has occurred, and each thus plays a key role in preventing inappropriate research.

Although regulatory scrutiny is vital to prevent harm to volunteers, a further level of protection is provided by the financial realities of drug development. The cost of bringing new MCMs to market is considerable, amounting to hundreds of millions of dollars, and as a consequence, the engagement of the pharmaceutical industry is essential. Drug companies are focused on making money and, given the relatively small size of the military market, will only invest significantly in the development of MCMs that could also be used to protect civilian populations. Thus, any MCM derived from military research will be exposed to intense public and financial scrutiny en route to being licensed.
Even when an MCM has been approved for human use, there are still questions regarding its administration to service personnel. For example, should immunization with biodefense-specific MCMs such as the anthrax vaccine be mandatory, as is the case for the U.S. Army? This raises issues of military governance and consent to treatment, which can only be dealt with by the relevant law in each country. If new MCMs are being developed, the target population needs to be considered in advance of likely use. It would be wasteful for defense research to develop new drugs that would not be acceptable to service personnel and would therefore be effectively unusable.

Public concern regarding dual-use issues and the ethics of performing defense-related research has led to the instigation of a range of checks and balances in the United Kingdom designed to reduce risk to a minimum. The effectiveness of these measures is rightly open to public debate, and it is hoped that future scientists, as well as members of the public, will be encouraged to make a full and active contribution to this debate to ensure that future regulatory decisions are based on evidence rather than driven by popular misconception.

Biological material with the capacity to cause harm can be found in a range of different institutions (hospital, academic, pharmaceutical, and government establishments, both civilian and military). Potentially, such organisms could be released into the environment following unforeseen accidents, due to negligence, or by deliberate intent. However, experience to date shows that the actual likelihood of human infection as a result of deliberate or accidental release is vanishingly small, particularly compared to that of contracting infections naturally, suffering harm from other types of accidents, or being the victim of a criminal or terrorist assault of some kind. The disproportionate fear that the threat of such infections arouses in the general population reflects a lack of understanding of the nature of risk, hazard, and probability, coupled with an understandable tendency of the popular media to exaggerate the impact of rare or imagined spectacular events. The research undertaken in organizations in which such microorganisms can be found has produced results of enormous benefit to human society in terms of improving health outcomes (better sanitation and advances in medicines and vaccines) and increasing the safety and efficiency of food production. Future developments in biotechnology hold the promise of major benefits to humanity in fields as diverse as mitigating the impact of climate change, improving agricultural yields in poor areas of the world, synthesizing novel materials on an industrial scale (e.g., biofuels), and discovering cures for major scourges such as tuberculosis and malaria.

How do we balance the enormous potential for good that biotechnology offers against concerns regarding its misuse? In the United Kingdom, the vast majority of microbiological research is performed in civilian organizations, with only a very small fraction being conducted by defense laboratories. Research activities in defense and civilian facilities in the United Kingdom are carefully regulated by a number of statutory bodies, such as the Health and Safety Executive, which monitors studies involving genetically modified organisms, and the Home Office, which oversees experiments involving animals.
Clinical trials involving human volunteers are regulated by the MHRA and are overseen by research ethics committees (which are themselves approved by the UK Ethics Committee Authority). However, more regulations, such as intrusive psychological profiling of staff working in microbiological laboratories or heavy-handed, overbearing, rigid assessment programs of scientific staff to 'ensure' reliability, are unlikely to further reduce the probability of an already extremely unlikely event. Rather, they are more likely to alienate well-motivated staff, thereby stifling research and the development of products and techniques that could bring major benefits to the United Kingdom and humanity as a whole. Indeed, fostering a supportive community among well-rewarded and appreciated scientists and staff would make it much easier to detect early signs of unhappiness, social problems, or unacceptable behavior in individual researchers. Such an approach would also be expected to produce better scientific outcomes.

The role and responsibility of scientists is central to minimizing misuse of technology, and thus it is vital that life scientists are encouraged to take ownership of this problem and, in doing so, assume a more proactive role in regulating, communicating, and explaining their activities to the wider public. Unfortunately, to date, the majority of scientists have demonstrated a marked reluctance to fill this role, for the reasons outlined previously. It has been suggested that improved education of scientists, the media, and the public would go some way toward addressing this issue. Improving the awareness of scientists could take many forms, such as the inclusion of teaching material covering biosecurity and dual-use issues in the curriculum of all life science undergraduates and in seminars, conferences, and publications dedicated to the subject. Scientists, particularly those engaged in areas of research that have the potential for misuse, must be encouraged to communicate the nature of their research as widely as possible to their fellow citizens. The most obvious vehicle through which to achieve this aim is the mass media, which is in a position to be a creative and positive influence in bringing scientists and the public together around these issues. It has the capability to improve communication and understanding, reducing unwarranted fears and sensationalist reactions to imagined threats. How we achieve this utopian dream in the face of the economic realities of a 24/7 multimedia society is a question beyond our powers.

Relevant websites: The Committee for Skeptical Inquiry; The Health and Safety Executive (UK).

Les Baillie is a professor of microbiology within the Welsh School of Pharmacy of Cardiff University. Prior to joining the university in 2007, he was Director of the Biodefence Medical Countermeasures Department based at the Naval Medical Research Center in Washington, DC. In this role, he led a multidisciplinary team developing novel therapeutics to combat the threat posed by biothreat agents. This research built on previous experience gained at Dstl Porton Down, where he led a research team working on anthrax. Having worked in both government defense and academic laboratories on two continents, he has a unique insight into the challenges faced by life scientists engaged in research in this area.

Dr. Hugh Dyson is a principal medical officer at Dstl Porton Down. He has previously worked in the National Health Service and academia, holding posts in renal medicine and pharmacology.
Dr. Andrew Simpson is a clinical microbiologist at Dstl Porton Down. He has previously held posts in the National Health Service and academia, and he worked for many years in Thailand at the Mahidol Oxford Tropical Medicine Research Unit.