At a time when concerns over experiments that create new, potentially dangerous pathogens have been amplified by recent biosafety lapses in high-profile labs, a new group of scientists, Scientists for Science (SFS), emerged this week to assert that research on dangerous pathogens can be (and generally is) conducted safely, that it is adequately regulated, and that such work is critical for public health (the current Ebola virus crisis has added to public attention on these issues). The SFS statement follows, and contrasts with, the recent statement from the newly formed Cambridge Working Group (CWG) (see here), which called for curtailing certain high-risk pathogen experiments until the scientific community conducts a thorough cost-benefit analysis, along the lines of the 1975 Asilomar conference on the risks of recombinant DNA research. The SFS rejects the Asilomar comparison and does not call for limiting such experiments now, but it does call for a more formal review of risks and benefits conducted by an outside expert body, such as the National Academy of Sciences (NAS). From the SFS statement:
Scientists for Science are confident that biomedical research on potentially dangerous pathogens can be performed safely and is essential for a comprehensive understanding of microbial disease pathogenesis, prevention and treatment. The results of such research are often unanticipated and accrue over time; therefore, risk-benefit analyses are difficult to assess accurately.
The CWG and SFS clearly disagree on the need to halt certain experiments while a more thorough review of the costs and benefits of high-risk pathogen research is conducted. However, a consensus is clearly emerging on both sides that public confidence in the need for such experiments, as well as continued funding support, requires more thorough assessment from outside experts. Neither statement references the National Science Advisory Board for Biosecurity (NSABB), the federal advisory committee that was most recently called on in 2011 to assess the publication of gain-of-function influenza H5N1 research. That panel has recently been reshuffled by NIH. A high-level study by the NAS on these issues therefore appears welcome to all sides of the current debate. The NAS (through its National Research Council) did undertake a general review of bioterrorism research after 9/11, entitled Biotechnology Research in an Age of Terrorism: Confronting the Dual Use Dilemma (2004) (the Fink Report), which called for the establishment of the NSABB, so there is a certain circularity here.
If we expect to continue to improve our understanding of how microorganisms cause disease we cannot avoid working with potentially dangerous pathogens. In recognition of this need, significant resources have been invested globally to build and operate BSL-3 and BSL-4 facilities, and to mitigate risk in a variety of ways, involving regulatory requirements, facility engineering and training. Ensuring that these facilities operate safely and are staffed effectively so that risk is minimized is our most important line of defense, as opposed to limiting the types of experiments that are done.
In contrast to recombinant DNA research at the time of Asilomar in 1975, studies on dangerous pathogens are already subject to extensive regulations. In addition to regulations associated with Select Agent research, experimental plans on other pathogens are peer reviewed by scientists and funding agencies, and the associated risk assessments are considered by biosafety experts and safety committees. Risk mitigation plans are proposed and then considered and either approved or improved by safety committees.
If there is going to be further discussion about these issues, we must have input from outside experts with the background and skills to conduct actual risk assessments based on specific experiments and existing laboratories. Such conversations are best facilitated under the auspices of a neutral party, such as the International Union of Microbiological Societies or the American Society for Microbiology, or national academies, such as the National Academy of Sciences, USA. We suggest they should organize a meeting to discuss these issues.
An update on efforts to enact state laws that mandate the labeling of genetically engineered (GE) foods: at present, Connecticut and Maine have enacted conditional GE food labeling laws, which do not take effect until a requisite number of neighboring states also pass such laws; effectively, these laws are dormant right now. In contrast, Vermont passed a GE food labeling law, Act 120, in April of this year, with its mandates to take effect in July 2016. Vermont did not follow the conditional model set by the other states, and supporters of the law expected litigation to follow. Since the enactment of this law, Vermont has been sued by the Grocery Manufacturers Association and other trade groups, which filed their complaint in June. The complaint alleges a violation of the First Amendment, arguing that the manufacturers would be subject to a form of compelled speech, and that this amounts to an impermissible content-based regulation that is unconstitutional. Even accounting for the sometimes more deferential review of speech-related laws that target commercial entities (see Central Hudson Gas & Electric Corp. v. Public Service Commission (1980)), the plaintiffs assert that the state fails even to muster a "substantial government interest," noting the failure of an earlier Vermont labeling law on dairy products from BGH-treated cows to pass constitutional scrutiny in International Dairy Foods Association v. Amestoy (2d Cir. 1996) (one of the plaintiffs in the current litigation, the International Dairy Foods Association, had challenged the earlier law as well).
The complaint against the Vermont GE law further alleges a Fifth Amendment defect pertaining to the vagueness of some terms in the statute, as well as a dormant Commerce Clause violation in view of the extraterritorial effects of the Vermont law on companies based outside Vermont that would be required to "establish Vermont-specific distribution channels." As this litigation unfolds, efforts to mandate GE food labeling continue in other states. Active legislative efforts are underway in Pennsylvania, New Jersey, Illinois, Massachusetts and New Hampshire. Lastly, voter-initiated ballot measures to mandate GE food labeling will appear this fall in Colorado and Oregon (earlier initiatives in Washington (2013) and California (2012) failed by narrow margins). All of these state efforts arise in the absence of (and amid official resistance to) any federal scheme for mandatory labeling of GE foods; the FDA imposes no such requirement but does allow voluntary labeling.
A new coalition of scientists, the Cambridge Working Group (CWG), has emerged to enter the debate over whether and how scientific experiments that deliberately create new pathogens should be conducted. This class of experiments aims to understand how mutations introduced into the genome of a known pathogen (e.g., H5N1 influenza virus) alter its properties. One goal cited for these experiments is to help public health officials identify the emergence of worrisome (viral) strains with possible pandemic potential. However, concerns over the safety of these laboratory experiments have been heightened by recent high-profile biosafety lapses in government labs (see here). Some of the scientists in the CWG have been members of the National Science Advisory Board for Biosecurity (NSABB), the federal advisory group that advises on biosecurity issues and dual-use research (NSABB has just announced a reshuffling of personnel, replacing almost half the current roster with new members). The controversy of two years ago, when "gain-of-function" (GOF) experiments with H5N1 influenza virus were conducted and then published, has continued as new experiments proceed with what have been called "potential pandemic pathogens" (PPP). The PPP class could include de novo constructed viruses as well as attempts to recreate previously known dangerous pathogens. Recently, one of the labs that published the H5N1 influenza experiments in 2012 reported that it had created a new influenza virus homologous (similar) to the 1918 influenza virus, which caused a pandemic that killed between 20 and 50 million people (the CDC had already reconstructed the 1918 virus in 2005). This week, the CWG announced its formation and issued this statement:
Recent incidents involving smallpox, anthrax and bird flu in some of the top US laboratories remind us of the fallibility of even the most secure laboratories, reinforcing the urgent need for a thorough reassessment of biosafety. Such incidents have been accelerating and have been occurring on average over twice a week with regulated pathogens in academic and government labs across the country. An accidental infection with any pathogen is concerning. But accident risks with newly created “potential pandemic pathogens” raise grave new concerns. Laboratory creation of highly transmissible, novel strains of dangerous viruses, especially but not limited to influenza, poses substantially increased risks. An accidental infection in such a setting could trigger outbreaks that would be difficult or impossible to control. Historically, new strains of influenza, once they establish transmission in the human population, have infected a quarter or more of the world’s population within two years.
The field of molecular biology previously confronted a scenario where the development of new technologies outpaced a thorough assessment of their potential risks. In a 1974 statement from a committee of the National Academy of Sciences that considered then-emerging recombinant DNA experiments, the concerns expressed about that technology are similar to the current responses to PPP experiments, although today’s concerns apply to the possible enhancement of already known pathogens. That 1974 report stated:
Several groups of scientists are now planning to use this technology to create recombinant DNAs from a variety of other viral, animal, and bacterial sources. Although such experiments are likely to facilitate the solution of important theoretical and practical biological problems, they would also result in the creation of novel types of infectious DNA elements whose biological properties cannot be completely predicted in advance.
The CWG statement, in turn, concludes:
For any experiment, the expected net benefits should outweigh the risks. Experiments involving the creation of potential pandemic pathogens should be curtailed until there has been a quantitative, objective and credible assessment of the risks, potential benefits, and opportunities for risk mitigation, as well as comparison against safer experimental approaches. A modern version of the Asilomar process, which engaged scientists in proposing rules to manage research on recombinant DNA, could be a starting point to identify the best approaches to achieve the global public health goals of defeating pandemic disease and assuring the highest level of safety. Whenever possible, safer approaches should be pursued in preference to any approach that risks an accidental pandemic.
The new CWG call for an Asilomar-type approach to evaluating and managing the risks of PPP experiments references the foundational 1975 Asilomar conference, called by scientists to deliberate how recombinant DNA experiments could be safely performed. That conference established principles for conducting the new recombinant DNA experiments, leading to the issuance of the 1976 Guidelines by the newly formed Recombinant DNA Advisory Committee (RAC) (see recent post on RAC's future). It is important to note that there is disagreement within the scientific community regarding the need for GOF influenza (PPP) experiments; see here for a brief on the value of such work and here for a critical take on such research. The CWG's call for further investigation of PPP research by the scientific community echoes the approach of Asilomar, but it also serves as notice to regulatory authorities (e.g., funding agencies) that the scientific community is aware of public concerns and is responding with deliberation. Professional self-regulation could preempt government-initiated controls on PPP research, such as funding restrictions.
Recent developments regarding biosafety practices in several government laboratories have raised concerns about the containment of potentially dangerous pathogens in scientific research. In the past month, a series of separate incidents exposed weaknesses in the oversight and management of dangerous pathogens. These included the accidental exposure of CDC scientists to anthrax in CDC labs, the discovery of forgotten vials of viable smallpox virus in an FDA lab housed at the NIH, and the unintentional cross-contamination of a benign influenza strain with a dangerous H5N1 strain, followed by the shipment of the contaminated sample to another lab. All of these events involved naturally occurring pathogens, but they occurred at a time when public debate continues over the deliberate creation of potentially dangerous pathogens in the field of dual-use research of concern (DURC). As scientists reported the creation of new pathogens in order to define which genetic changes correlate with pathogenicity or transmissibility, concerns emerged over how such scientific detail should be publicly shared and how such pathogens were to be safely contained in the laboratory. Most of the attention focused on the publication of genetic detail, reflecting concerns that the pathogens could be reconstructed for malicious intent. However, an equally serious concern is that the newly designed pathogens could be released inadvertently, through laboratory or personnel errors. The recent series of safety lapses amplifies concerns over the general state of biosafety practices in laboratories handling the most dangerous pathogens, whether natural or engineered. Although the standards for the containment of dangerous pathogens in laboratories are well known and generally followed, a single untoward release of a high-risk infectious agent could be catastrophic.
In general, established biosafety protocols define the facilities, procedures and personnel required for work with a particular pathogen, based on the level of risk it poses to public health and/or the environment. The guidelines assign a biosafety level (BSL) based on that analysis; high-containment BSL-3 and BSL-4 labs are required for work with the most dangerous pathogens. The transport of such pathogens is likewise managed under protocols that establish safe transfer. The CDC director, Dr. Thomas Frieden, conceded the pattern of biosafety lapses in a press conference and at a Congressional hearing yesterday; he has shut down several labs and instituted a moratorium on some shipments of pathogens. The Government Accountability Office (GAO) has conducted several studies of safety in high-containment laboratories (private or public), specifically noting the absence of any overarching federal body to oversee these laboratories, particularly in view of their proliferation in the years since 9/11 and the anthrax incidents, as bioterror-related research has increased (note the recent controversy over a BSL-4 lab established in Boston). The GAO was represented at yesterday's hearing, and it reminded Congress of its previous investigations and recommendations on laboratory safety; the recent incidents may cause Congress to revisit this work and act accordingly.