title: Ethics and Terror Medicine
author: Cole, Leonard A.
date: 2009
journal: Essentials of Terror Medicine
DOI: 10.1007/978-0-387-09412-0_25

The field of medicine has long been defined not only by diagnostic and treatment techniques but also by standards of behavior. The Hippocratic Oath was introduced in ancient Greece in the fifth century BCE, about the same time as the concepts of case histories and prognosis. Despite vast changes in medicine through the ages, the oath's core message continues to resonate: that a physician has a special responsibility to perform honorably. Forms of the Hippocratic Oath are still recited during graduation ceremonies at medical schools, many of them in the United States, though the classical version has been altered to suit contemporary values. 1 For example, passages in the early oath that prohibited the practice of abortion or euthanasia are now commonly omitted. The shifting text reflects attempts to accommodate medical ethics to new findings, experiences, and values.

The Nuremberg Code also proved to be a forerunner of enhanced sensitivity in general to the rights of patients and the responsibilities of physicians. 3 Prior to this period, in the United States and elsewhere, medical doctors were often viewed as an insular and exclusive fraternity. The American Medical Association (AMA), since its inception in 1847, has periodically issued codes of medical ethics. An early twentieth-century version noted the responsibility of physicians to their patients, but dwelt at greater length on "the duties of physicians to each other and to the profession." 4 Thus, no less important than the care of patients was the felt need to nurture the privileged status of the profession. This sense of paternalism was also reflected in the code's admonition that doctors ordinarily not give patients complete information about their condition.
By the 1950s, the notion of elitism as an ethical principle had diminished, and the AMA's 1958 principles of ethics made no mention of doctors' obligations to each other. 5 Subsequent cultural changes in the general society further eroded paternalistic attitudes in favor of the rights of the individual. During the 1970s, the field of bioethics grew in response to information about pre- and postwar experiments on unwitting human subjects by agencies of the US government. The Public Health Service had dispensed placebos to syphilitic black men in Alabama in order to observe the course of their untreated disease. 6 The Central Intelligence Agency had secretly dropped mind-altering drugs into the drinks of citizens to watch their reactions. 7 The Atomic Energy Commission had sponsored experiments in which unwitting hospital patients were injected with plutonium. 8, 9 These experiments clearly violated the informed consent precept of the Nuremberg Code, though they had ended by the time the public learned about them. Still, the belated revelations prompted outrage and generated strengthened legal and institutional safeguards to protect human subjects. At the same time, in the world of medicine, the notion of patient self-determination had gained further traction. In 1980, the AMA issued a revised code of ethics that for the first time included an admonition that a physician must "respect the rights of patients." 10 This ideal was becoming as firmly entrenched as a matter of principle as the earlier notion of medical paternalism once was. The trend was abetted by the burgeoning recognition of the AIDS crisis, as civil libertarians joined with homosexual rights activists to press for protection of privacy in the public health sphere. They successfully resisted efforts to mandate HIV testing, and for several years even blocked a public health requirement that physicians report people with HIV infection.
11 Critics who worried that a boundless primacy of individual rights could harm the general welfare seemed to be a dwindling minority. But the simmering tension between individual and community rights resurfaced over the issue of terrorism. Until the end of the twentieth century, neither medical ethics nor bioethics was focused on terrorism. Ethicists, particularly in the United States, were contending with other matters, many of them born of biomedical technological advances. Some of these issues, including embryonic stem-cell research, organ transplantation, and physician-assisted suicide, continue to be subjects of intense national debate. But the attacks of September 11, 2001, and the subsequent release of anthrax spores in the US mail also heightened concerns about national security. The resulting policies on ethnic profiling, wiretapping, and other intrusions into privacy and individual liberty have been contentious. Concerns about terrorism have also prompted debates about the limits of scientific inquiry. For example, the recent synthesis of the poliovirus suggests that potential bioagents like the smallpox virus could be fabricated as well. The scientific community remains divided about whether such research should be prohibited, or publication about it restricted, in the name of national security. 12 In the medical and bioethical arenas, terrorism has also highlighted areas of tension between the rights of individuals and the needs of the larger society. Striving for balance between individual liberties and public security is a perennial challenge in every civil society. But the anticipation of numerous casualties from terrorism or other disasters has generated particular concern about the propriety of medically connected coercion in mass casualty events. 13 Four prominent examples involve the relationship of ethics to quarantine, vaccination, triage, and the responsibilities of healthcare workers.
Through most of the twentieth century, the right of government officials to impose quarantine for health purposes, though periodically challenged, was generally accepted and upheld in the courts. 14 But in recent decades, reflecting the cultural shift toward individual rights, many bioethicists began to view quarantine as an inappropriate infringement. Then, following the 9/11 and anthrax attacks, the Center for Law and the Public's Health at Georgetown and Johns Hopkins Universities produced a draft of model legislation for states to deal with bioterrorist and other public health emergencies. Titled the Model State Emergency Health Powers Act (MSEHPA), it would empower state and local officials during a public health emergency to appropriate property and require medical tests, vaccinations, and quarantine without due process. 15 The MSEHPA drew criticism from several bioethicists, none more withering than that of George Annas, who described its provisions as draconian: "The model act seems to have been drafted for a different age; it is more appropriate for the United States of the 19th century than for the United States of the 21st century. Today, all adults have the constitutional right to refuse examination and treatment, and such a refusal should not result in involuntary confinement simply on the whim of a public health official." 16 Two principal authors of the act, James Hodge and Lawrence Gostin, rejected the notion that "respecting individual civil liberties was an overriding good." Rather, they held that "restraints of civil liberties may be justified by the compelling need to protect public health." 17 Soon after, the issue was tested by real-life experience. In 2002, severe acute respiratory syndrome (SARS) was recognized as a potentially fatal communicable disease caused by a previously unknown virus. An outbreak of SARS began in Canada in February 2003, and by April some 350 people had been infected, 20 of whom had died.
Canadian authorities then instituted a quarantine of about 15,000 residents of Toronto, the center of the outbreak. 18 This was the first imposition of widespread quarantine measures in North America in more than 50 years. Quarantined persons were instructed not to leave their homes or have visitors, to wash their hands frequently, to wear masks when near other household members, to take their temperature twice daily, and not to share personal items like towels or drinking cups. The median duration of quarantine was 10 days. In a survey soon after their quarantine ended, some 30% of respondents reported symptoms of depression or posttraumatic stress disorder. More than half felt they had received inadequate information about home infection control measures. But only 15% believed they should not have been placed in quarantine. 19 By the time of the last recorded Canadian case, in July, 44 victims had died, and the effectiveness of the quarantine remained unclear. But soon after, a preliminary study for the Toronto Board of Health concluded that while quarantine did not eliminate SARS, "it was effective in reducing transmission by about 50% for the closest community contacts." 20 Following the Canadian experience, Ross Upshur, a medical ethicist, noted the divide between those who consider quarantine an unwarranted diminution of personal liberty and those who deem it important for disease control. Still, in the midst of the outbreak and afterwards, few Canadians objected to the government's decision. Upshur expressed the prevailing view that public health officers should err on the side of safety rather than risk exposure to a preventable disease. 21 Confinement during the Canadian experience was for a limited duration. Moreover, potential management complications, such as issues concerning the availability of food for the confined individuals and provision of care for their children, did not become evident.
Under similar conditions, the implications for future man-made or naturally occurring disease outbreaks seem hardly in doubt: despite objections to imposed quarantine by some academic ethicists, the public would apparently accept such a measure for a prescribed period if health authorities deemed it necessary.

The history of vaccinations began with the understanding that people who survived certain diseases became immune to contracting them again. This phenomenon was long recognized in the case of smallpox, whose fatality rates from virulent strains could reach 50%. Efforts to protect against smallpox led to crude inoculation practices in several ancient societies. In India, as long ago as 1000 BCE, the skin of healthy individuals would be cut open for the placement of pus or scabs from people with a mild form of the disease. The procedure, known as variolation, caused an infection that killed perhaps 1% of recipients and made the others very sick. But after recovery they too were no longer susceptible to smallpox. Variations of the procedure were later practiced in Tibet, China, and eventually in Europe during the eighteenth century. 22 In the late 1700s, the English physician Edward Jenner found that milkmaids who had contracted cowpox, a relatively innocuous disease, were also immune to smallpox. In 1796, he conducted an experiment that today would doubtless have put him in jail. Jenner made an incision in the arm of an 8-year-old boy and injected into it fluid from a cowpox pustule of an infected milkmaid. The boy's arm developed a rash and blisters, which healed to leave only a small scar. Jenner then injected the boy with pus from a smallpox case, but it produced no symptoms. The experiment demonstrated the protective effect of cowpox infection against smallpox. In time, the genetic make-up of the two causative agents, the variola virus (smallpox) and the vaccinia virus (cowpox), was found to be similar.
Nearly a century after Jenner's experiment, the French chemist Louis Pasteur produced an effective vaccine from the actual agent that causes a disease. In 1881, he showed that livestock could be protected from contracting anthrax if injected with an attenuated form of the anthrax bacterium. This opened the way to the development of vaccines against a range of diseases, from cholera and plague to polio. In the course of the twentieth century, vaccinations dramatically reduced the incidence of many health nemeses. Indeed, a global vaccination program against smallpox eradicated the disease in 1979. 23

Compulsory vaccination in the United States and elsewhere extends back to the nineteenth century. Currently every state has a law requiring children to be vaccinated against certain diseases, including diphtheria, measles, rubella, and polio, before enrolling in school. These are among 26 vaccine-preventable diseases for which vaccines are available, according to the Centers for Disease Control and Prevention (CDC). 24 But many states permit exemptions for medical, religious, or philosophical reasons. 25 In recent decades, questions about the safety of some vaccines have prompted more people to seek exemptions for themselves and their children. The trend accelerated following a 1998 report that suggested a possible link between autism and the vaccine for measles, mumps, and rubella (MMR). A survey in the United Kingdom indicated that before the controversial report was issued, 92% of children there received the MMR vaccine, but 5 years later the figure had fallen to 79%. 26 The trend has generated concern among public health authorities, who question the validity of the purported linkage and have strongly reaffirmed the value of childhood immunizations. Studies have ascribed childhood outbreaks of measles in Philadelphia and whooping cough (pertussis) in Boulder, Colorado, to low vaccination rates in the affected communities.
27 Diphtheria cases in Russia increased from 900 in 1989 to 50,000 in 1994, also attributable to a drop in vaccination rates. Moreover, polio, measles, and childhood meningitis (from Haemophilus influenzae type b bacteria) have been nearly eliminated wherever vaccination rates against them are high. Most dramatically, before smallpox was eradicated by immunization in 1979, it had killed 300 million people in the twentieth century. 28

Still, after the Iraq War began in 2003, the safety of vaccinations again became an issue. Fears of an Iraqi attack with smallpox and anthrax agents prompted a policy of vaccinating military personnel against these diseases. When some troops suffered serious side effects from either vaccine, the program was suspended. Ironically, the biological threat from Iraq later proved to be nonexistent; its biological weapons had evidently been destroyed years earlier. Even healthcare workers who respect the protective value of vaccinations have placed a higher priority on the right to refuse. Thus, the Washington State Nurses Association in 2004 filed suit to prevent a Seattle hospital from mandating flu vaccinations for its personnel. The nurses association sidestepped the contention that an inoculated staff would improve patient safety. Rather, the group stated that it supported flu vaccinations but that the approach to compliance must be through education, not threats. 29 Subsequently, many medical leaders ramped up efforts to underscore the value of immunization. In 2007, the Sabin Vaccine Institute released a statement that strongly supported vaccination programs. Signed by more than 100 leading physicians and medical administrators, and endorsed by the AMA, the statement described "immunization as the safest, most effective way to control and eradicate infectious diseases." 30 Whether such efforts will alleviate the concerns of skeptics remains unclear. In fact, people have refused vaccinations even in the face of dire disease threats.
During the global campaign to wipe out smallpox in the 1960s and 1970s, some people in countries where the disease was endemic had to be forcibly inoculated. In 1976, few Americans heeded government warnings to seek vaccination against an anticipated swine flu outbreak. In the end, the epidemic never materialized, and several people who were vaccinated suffered serious side effects. The jetliner and anthrax attacks in the United States in 2001 also raised concerns that terrorists might seek to use smallpox as a biological weapon. The following year, President George W. Bush announced a plan to vaccinate 500,000 US healthcare workers, and later up to 10 million firefighters, police, and emergency responders, against smallpox. 31 But hundreds of hospitals and thousands of medical personnel refused to comply. By the end of 2004, fewer than 40,000 people had been vaccinated and the program was abandoned. 32 Reluctance was attributed to the failure of the government to provide adequate information, though concern about side effects of the vaccine also played a part. (Smallpox vaccinations can cause one or two deaths and a few dozen serious illnesses per million recipients.) Unlike quarantine, vaccinations are invasive and carry a risk, however minimal, of undesirable effects. Thus, compulsory vaccinations are likely to draw more resistance than quarantine. Still, the public health benefits of vaccination are indisputable, and they far outweigh the small risk of untoward effects. Resistance to implementing a massive vaccination or prophylactic drug program occurs when a prospective disease outbreak is perceived as unlikely. In this situation, the public is less willing to cooperate than if an outbreak were already evident. An actual outbreak would enhance acceptance of medications, as evidenced by public reactions during the anthrax attacks in 2001.
Following the first confirmed death from inhalation anthrax, physicians and pharmacists were overwhelmed by demands from the general population for ciprofloxacin and other antibiotics. The challenge is to find a balance in public health policy that can overcome complacency in advance of a possible outbreak yet avoid frenzied overreaction when an outbreak is in progress.

The usual connotation of triage, French for "sorting," derives from military medicine. It encompasses a quick assessment and division of casualties on the battlefield according to the severity of injury. The concept emerged in the French army during the early 1800s, when the most severely injured were identified and then evacuated for care without regard to rank. 33 Triage has been employed increasingly in American hospitals in recent decades as emergency departments have experienced overcrowding. The expanded patient load is attributable in part to the Emergency Medical Treatment and Active Labor Act (EMTALA) of 1986. EMTALA requires that any patient who arrives at a hospital emergency department must receive a medical screening examination and be provided with emergency care. 34 Treating all patients according to need, irrespective of insurance or economic status, echoes the early French army approach to triage that disregarded rank.

Toward the end of the twentieth century, the threat of terrorism involving weapons of mass destruction became a growing public health concern. A recognized weakness was the absence of guidelines for emergency physicians in the event of a massive biological, chemical, or radiological attack. In such a setting, "triage may bear little resemblance to the standard approach to civilian triage," according to a study on terrorism and ethics. Since treatment might necessarily be denied to some, physicians should not have to make individual triage decisions. Rather, protocols should be formulated based on bioethical decision-making.
35 This concern was intensified in the United States by the 2001 terror attacks, which prompted a gathering of some 40 experts in the fields of bioethics, emergency medicine, health law, and policy. Their meeting was convened by agencies in the Department of Health and Human Services, including the Agency for Healthcare Research and Quality. The conferees spotlighted the novel challenges posed by mass casualties from "an act of bioterrorism or other … medical emergency involving thousands or even tens of thousands of victims." Their deliberations resulted in a 2005 report titled Altered Standards of Care in Mass Casualty Events. 36 The report proposed sharp deviations from commonly understood ethical conduct. Thus, in mass casualty events, providers may have to reuse disposable supplies, may not have time to obtain informed consent, and may discharge hospital inpatients early; indeed, "certain lifesaving efforts may have to be discontinued." Regarding triage, traditional protocols to provide care to the sickest and most injured would not apply. Instead, triage efforts would focus on maximizing the number of lives saved. That means giving priority to individuals whose chances of survival are best, not to the sickest or most injured, whose care would require disproportionate attention and scarce supplies. 37 The World Medical Association (WMA) had listed some of these recommendations in a 1994 statement on medical ethics in disaster situations. In 2006, the WMA produced a revised advisory on triage that included separating patients into five categories and treating them in hierarchical order. The highest priority goes to patients who can be saved but whose lives are in immediate danger. Next come patients whose lives are not in immediate danger but who need urgent care. Third are injured patients who require only minor treatment. Fourth are individuals who are psychologically traumatized but not physically injured.
Last are "patients whose condition exceeds the available therapeutic resources, who suffer from extremely severe injuries … and cannot be saved in the specific circumstances of time and place, or complex surgical cases … which would take too long." 38

At about the same time, Ontario health officials, along with other medical and ethics specialists, authored a study on the effects of a massive outbreak of a communicable disease. Models of an influenza pandemic indicated that hospital admissions for infected patients could peak at more than 1,800 per day over a 6-week period. Available resources would be inadequate to address everyone's needs. Limits on care would have to be imposed, which, as the study noted, is a concept foreign to medical systems in developed countries. The study proposed a triage protocol based on numerical scores. A person's score would derive from an aggregation of measurements including respiratory function, blood pressure, cardiac condition, neurological responses, organ failure, and age. Depending on the score, a patient would be placed in a category designated by color. Red-category patients are the highest priority. These are sick individuals who are likely to recover if they receive intensive care, though unlikely to recover without it. Yellow signifies intermediate priority, for very sick patients who may or may not benefit from critical care. They could receive such care if resources are available, though not at the expense of the needs of people in the red category. People in the two remaining color categories would not receive critical care at all. Green covers persons well enough to be treated without intensive intervention. Blue signifies patients who are so ill that they should receive only palliative measures, not critical care. 39 How effective this protocol would be in the event of a disease outbreak, whether of deliberate or natural origin, is uncertain.
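The logic of a score-to-color triage protocol can be made concrete with a short sketch. The code below is a hypothetical illustration only, not the published Ontario protocol: the numeric thresholds, the 0-100 score scale, and all function and class names are invented for demonstration, whereas the actual study derived scores from clinical measurements such as respiratory function, blood pressure, and organ failure.

```python
# Hypothetical sketch of color-coded triage: map an aggregate severity
# score to a category, then fill scarce ICU beds red-first.
from dataclasses import dataclass
from enum import Enum


class Category(Enum):
    RED = "highest priority: likely to recover only with critical care"
    YELLOW = "intermediate: may or may not benefit from critical care"
    GREEN = "well enough to be treated without intensive intervention"
    BLUE = "too ill for critical care: palliative measures only"


@dataclass
class Patient:
    name: str
    score: int  # aggregate severity score on a hypothetical 0-100 scale


def categorize(score: int) -> Category:
    """Map a severity score to a triage color (cutoffs are invented)."""
    if score < 30:
        return Category.GREEN   # mild illness: no intensive care needed
    if score < 60:
        return Category.RED     # likely to recover if given intensive care
    if score < 80:
        return Category.YELLOW  # uncertain benefit: treat only if beds remain
    return Category.BLUE        # condition exceeds available resources


def allocate(patients: list[Patient], icu_beds: int) -> list[Patient]:
    """Fill ICU beds with red-category patients first, then yellow."""
    reds = [p for p in patients if categorize(p.score) is Category.RED]
    yellows = [p for p in patients if categorize(p.score) is Category.YELLOW]
    return (reds + yellows)[:icu_beds]
```

The point of the sketch is the ethical structure, not the numbers: yellow patients are never admitted ahead of red ones, and green and blue patients are excluded from critical care entirely, mirroring the priority ordering the study describes.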
But the study underscores the need to reconfigure traditional understandings of ethics. The authors acknowledge that under normal circumstances, all patients should have a claim to the health care they need. But when faced with bioterrorism or another large-scale disaster, individual rights and needs may be restricted in the interest of the larger community. Certain terrorist acts may generate unique ethical challenges. For example, a terrorist who unintentionally survives a suicide bombing may be critically wounded, as are the people he targeted. Although some of his victims may be less severely injured than the terrorist, would treating them first be morally acceptable? If so, does this mean by extension that in a mass casualty event, some should receive priority treatment based not on severity of injury but on their age or perceived value to society? Existing ethical codes of healthcare organizations provide little guidance on such questions. The AMA and other health organizations should consider these matters and provide guidance to their constituents.

Do physicians and other healthcare workers have a responsibility to treat patients when their own health and lives could be endangered? The current AMA code says physicians have a duty to provide care in emergencies and that responsibility to the patient should be the paramount consideration. 40 The code of the American Dental Association affirms that dentists should regard the benefit of the patient as their primary goal. 41 The American Nurses Association explicitly addresses ethical obligations during a disaster event, including a nurse's obligation to protect themselves with appropriate gear. 42 But as with other healthcare groups, the nurses association's ethics protocols are silent on the matter of refusing to give care in the interest of personal safety.
A review of studies in the United States, Canada, Israel, and Asian countries showed that many healthcare workers would not report to work if doing so might risk their own or their families' safety or health. 43-45 A survey of US emergency medical technicians indicated that willingness to report to work varied with the nature of the disaster. Seventy-four percent of the respondents indicated they would show up in the event of a terrorist attack involving chemical or radioactive agents. The number fell to 65% for a smallpox outbreak. 46 Another study, which included physicians and nurses as well as emergency technicians, found that commitment to work at a disaster scene ranged from as high as 84% to as low as 18%, depending on the perceived danger. 47 But the basic finding of these studies was consistent: in the event of a terrorist or disaster incident, absenteeism among healthcare workers could be substantial, as projected by the workers themselves.

The attitudes expressed in the surveys have been mirrored by actual experience. In the 1980s, when the outbreak of HIV/AIDS began in the United States, several physicians, dentists, and nurses refused to treat patients for fear of becoming infected and transmitting the disease to their families. Many remained hesitant even after the CDC issued assurances that transmission was unlikely if infection control precautions were taken, such as using latex gloves. During the bioterror attacks after 9/11, when anthrax was released through the US postal system, fearful pathologists were reluctant to perform an autopsy on the first confirmed anthrax victim. 48 The behavior of health professionals was similar during the 2003 outbreak of SARS in Toronto. Many doctors refused to treat patients who were infected with the virus, and some resigned from their hospitals rather than face pressure to treat them.
The shortage of available physicians prompted the Canadian government to offer temporary medical licenses and two thousand Canadian dollars per day to doctors from the United States who would come to help. About 300 US physicians accepted the offer. 49 In the end, some healthcare workers who tended to SARS patients became infected, though the vast majority did not. Infection control measures generally provided adequate protection. Still, the notion of an absolute duty of care without regard to the well-being of self and family is ethically untenable. Of course, by entering the medical profession a doctor agrees to accept elevated risks, such as increased exposure to patients with infectious diseases. But refusal to render treatment in certain dangerous environments is not invariably immoral. Daniel Sokol, a British medical ethicist, recounts the experience of a physician who visited the Congo during the 1995 Ebola epidemic. The doctor came upon 30 dying patients amid rotting corpses in a hospital that had been abandoned by its staff. The Ebola virus is highly virulent, communicable, and unresponsive to treatment. In the absence of any remaining palliative medication or equipment for self-protection, should the last physician have stayed to offer inevitably futile care? Sokol's conclusion, that abandonment in this case was justifiable, seems defensible. 50 The ultimate challenge is to establish ethical criteria that allow for refusal to work in a dangerous environment. Ignoring the matter, as current ethical codes of healthcare organizations do, invites poorly informed and unwarranted behavior among practitioners.

Besides giving rise to the field of terror medicine, the surge of Palestinian attacks against Israelis generated new ethical dilemmas not only for health workers but also for the general population. In 2005, when a suspicious young man approached a shopping mall in the northern Israeli city of Netanya, passersby alerted a nearby policewoman.
The man ignored the officer's calls to stop. She drew her gun and began to run after him but was reluctant to shoot amid the crowd of people. At the mall entrance, a security guard tried to apprehend the man, who then detonated a bagful of explosives, killing the guard and four others. 51, 52 Should the police officer have fired her gun despite the possibility of hitting bystanders? What if the man turned out not to have been a terrorist? Shooting the culprit could have saved innocent lives, though the officer's restraint can hardly be condemned. The incident exemplifies excruciating moral dilemmas posed by the threat of terrorism. Other moral choices are more directly related to health and wellbeing. At a forum of the United Nations, a Palestinian spokesman condemned Israel for establishing checkpoints, which delayed ill and pregnant Palestinian women from reaching hospitals. An Israeli representative replied that Israel sought to accommodate people with medical needs, though careful screening was necessary. Between 2000 and 2006, Israeli authorities thwarted suicide attacks by more than 50 Palestinian women. But eight women had succeeded in blowing themselves up and killing scores of Israelis. 53 In fact, in 2002, a young Palestinian woman acknowledged a plan to disguise herself as a pregnant woman and carry out a suicide attack amid a crowd. She was apprehended before putting the plan into action. 54 Later, in June 2005, another woman was found to have explosives under her trousers when she tried to enter Israel from Gaza, an adjacent Palestinian territory. She previously had been treated at Israel's Soroka Hospital in Beer Sheva for burns from a cooking accident in her Gaza home. Now, by her own account, she had hoped to return to the hospital and kill dozens of people there. 55 An additional threat to healthcare workers arose with the discovery of weapons and gunmen in some Palestinian ambulances. 
As a result, all vehicles, including Israeli ambulances, are searched at the perimeter of a hospital's grounds before they are permitted to reach the entryway. The Israeli Supreme Court rejected a petition by Physicians for Human Rights, which had protested the practice of stopping and searching Palestinian ambulances. Although international law recognizes the neutrality of ambulances and medical personnel, the Court held that the Israeli practice was justified by the actions of Palestinian belligerents. 56 Ordinarily, impeding pregnant women or ambulances from traveling freely should be impermissible. But behavioral codes that normally apply must be reevaluated in the face of a competing moral claim, in this case on behalf of potential victims of terrorism. Michael Gross, an Israeli medical ethicist, laments the upending of traditional medical neutrality, but notes that international law and custom provide no alternative solution to the insurgent behavior. 57

While the medical community and others in Israel have had to face these issues as part of their daily lives, they may seem less pressing elsewhere. Still, they merit consideration not only as theoretical exercises but as templates for policy wherever the threat of terrorism exists. Not every preparedness or protective measure employed by the Israelis is applicable in other societies, but many are. For example, to minimize absenteeism during terror alerts, hospital workers should be assured of priority care for themselves, as is the case in Israel. Along with other first responders, they would be among the first to receive vaccinations in the event of a biological attack. Work attendance would be further encouraged during prolonged conflict by providing 24-hour onsite nursery care and kindergarten for children whose parents work elsewhere in the hospital. Numerous other lessons can be learned from the Israeli experience. 58 Table 25.1 lists examples of ethical challenges associated with terror medicine.
As noted, some items pertain exclusively to terror incidents, and some may apply as well to other disaster events.

Incidents of terrorism in recent years have altered approaches to medical care and raised a host of issues concerning medical ethics. The threat of bioterrorism and natural disease outbreaks has revived debates about the propriety of compulsory quarantine and vaccination, and altered priorities during triage. Other challenges have a briefer history, such as dealing with an apparently pregnant woman in need of care who may actually be a suicide bomber. Terrorism has also recast the role of the emergency responder who traditionally rushes to the scene of an incident. Nearly simultaneous bombings in the same vicinity have occurred in Israel, and with increasing frequency in Iraq and Afghanistan. A responder must wonder whether a second and third attack might quickly follow the first at the same location. If a responder hesitates to enter the scene, is he morally irresponsible? Such questions were hardly considered in the past. But in this era of heightened terrorist threats, ethicists are presented with new issues to contemplate. As the health and medical consequences of terrorism have given rise to the field of terror medicine, so have their moral implications generated concern with an associated new dimension, terror ethics.

Table 25.1. Examples of ethical challenges associated with terror medicine
1. Delaying and searching someone who is in apparent need of medical attention, but who may be a suicide bomber concealing explosives.
2. Stopping ambulances for inspection before they reach a hospital, though they may be transporting people in urgent need of care.
3. Expecting responders to rush to an attack scene despite the risk of additional bombs being detonated at that location following the initial attack.
4. Dealing with hospital personnel who are reluctant to part with their frightened children and other family members during an extended attack.
5. Dealing with hospital personnel who do not report to work for fear of harm to themselves or their families from exposure to a biological agent.
6. Treating a critically injured terrorist before (or after) less severely injured victims are treated.
7. Deciding when to reuse disposable supplies such as needles, latex gloves, and protective masks.
The first six challenges are exclusive to terrorist threats/incidents, and the remaining six apply to both terrorist and other types of disaster events.

References
"I swear by Apollo": on taking the Hippocratic oath.
Factories of Death: Japanese Biological Warfare 1932-45 and the American Cover-Up.
The Nazi Doctors and the Nuremberg Code: Human Rights in Human Experimentation.
Bad Blood: The Tuskegee Syphilis Experiment.
The Search for the Manchurian Candidate: The CIA and Mind Control.
Advisory Committee on Human Radiation Experiments. The Human Radiation Experiments.
The Plutonium Files: America's Secret Medical Experiments During the Cold War.
Rights and dangers: bioterrorism and the ideologies of public health.
Committee on Research Standards and Practices to Prevent the Destructive Application of Biotechnology, National Institute of Medicine. Biotechnology Research in an Age of Terrorism.
The Ethics of Coercion in Mass Casualty Medicine.
Draft for Discussion Prepared by the Center for Law and the Public's Health at Georgetown and Johns Hopkins Universities, for the Centers for Disease Control and Prevention.
Bioterrorism, public health, and civil liberties.
Protecting the Public's Health in an Era of Bioterrorism. In: In the Wake of Terror: Medicine and Morality in a Time of Crisis.
Severe acute respiratory syndrome (SARS) outbreak in Canada. Mapleleafweb.
SARS control and psychological effects of quarantine.
Preliminary Results of SARS-Related Public Health Research. Toronto Staff Report to the Board of Health.
The ethics of quarantine.
Scourge: The Once and Future Threat of Smallpox.
Smallpox and Its Eradication. Geneva: World Health Organization.
Report for Congress by the Congressional Research Service.
A shot in the dark. Guardian.
Parental resistance to childhood immunizations: clinical, ethical and policy considerations.
Open Statement on Vaccines. The Sabin Vaccine Institute.
Washington state nurses sue hospital over mandatory flu vaccination rule. amednews.com.
Open Statement on Vaccines.
Bush to announce plan for smallpox vaccinations.
The Smallpox Vaccination Program: Public Health in an Age of Terrorism. Institute of Medicine.
The American Heritage Dictionary.
Terrorism and the ethics of emergency medical care.
Altered Standards of Care in Mass Casualty Events: Bioterrorism and Other Public Health Emergencies. AHRQ Publication No. 05-0043.
Health and medical care delivery in a mass casualty event.
World Medical Association Statement on Medical Ethics in the Event of Disasters. Revised by the WMA General Assembly.
Development of a triage protocol for critical care during an influenza epidemic.
American Dental Association. Principles of Ethics and Professional Conduct.
American Nurses Association. Position Statement on Work Release During a Disaster. Adopted by the Board of Directors.
Emergency health care workers' willingness to work during major emergencies and disasters.
SARS risk perceptions in healthcare workers.
Fear of severe acute respiratory syndrome (SARS) among health care workers.
The willingness of U.S. emergency medical technicians to respond to terrorist incidents.
Will emergency health care providers respond to mass casualty incidents? Prehospital Emergency Care.
The Anthrax Letters: A Medical Detective Story.
The doctor's world: behind the mask, the fear of SARS.
Virulent epidemics and scope of healthcare workers' duty of care. Emerging Infectious Diseases.
Palestinian bomber kills himself and 5 others near Israel mall.
Economic and Social Council, Commission on the Status of Women.
Israel Ministry of Foreign Affairs. A woman terrorist, en route to carry out a suicide attack, arrested in Tulkarm.
Descent from patient to suicide bomber.
Israel Ministry of Foreign Affairs. Palestinian misuse of medical services and ambulances for terrorist activities.
Bioethics and Armed Conflict: Moral Dilemmas of Medicine and War.
Terror: How Israel Has Coped and What America Can Learn.