The FDA will hold two public workshops in the next several months addressing areas of genetic test regulation that may be the subject of upcoming agency actions. The FDA derives its authority over genetic testing from its general mandate to regulate medical devices; one category of device is the in vitro diagnostic (IVD), a term that generally captures tests and assays used in the diagnosis (or treatment) of disease, including genetic tests. The question of whether laboratory-developed tests (LDTs) should be directly regulated by the agency remains unresolved. Because LDTs constitute the majority of commercially available genetic tests in the U.S., the absence of regulation of the utility and/or validity of these tests means that most genetic tests in the U.S. are not subject to FDA oversight (although general certification of laboratories does occur under the Clinical Laboratory Improvement Amendments (CLIA)). The FDA published its Framework for Regulatory Oversight of Laboratory Developed Tests last fall, proposing a risk-based classification and regulatory structure for LDTs. The agency provided its rationale for increasing its involvement in this genetic testing sector:
LDT’s are important to the continued development of personalized medicine, but it is important that in vitro diagnostics are accurate so that patients and health care providers do not seek unnecessary treatments, delay needed treatments, or become exposed to inappropriate therapies. The FDA has generally not enforced premarket review and other applicable FDA requirements because LDTs were relatively simple lab tests and generally available on a limited basis. But, due to advances in technology and business models, LDTs have evolved and proliferated significantly since the FDA first obtained comprehensive authority to regulate all in vitro diagnostics as devices in 1976.
The public workshop for discussion of the LDT framework will be held on January 8-9, 2015; public comments on the online document can still be filed until February 2, 2015.
The FDA is also considering how it might interface with the field of next generation sequencing (NGS), a term applied to high-throughput methods for generating multiple DNA sequences in parallel; such methods can produce whole-genome or exome-only DNA sequences quickly and efficiently. Typically, the sequencing operation does not start with a particular clinical goal, gene, or mutation of interest; the approach is clinically neutral and aims to produce genome-wide DNA sequences. NGS therefore creates high-volume data that requires substantial computational analysis to yield meaningful, clinically useful results, and it can reveal rare and previously unknown genetic variants. Because of the broad-brush nature of whole-genome sequencing, it has the potential to reveal incidental (undirected) findings; the ethical management of such information by medical personnel has been the subject of bioethical and academic debate. The FDA has issued a discussion paper which highlights specific concerns and possible standardization for NGS technologies:
NGS tests are unique among existing IVDs in the amount of data that can be generated, the lack of an a priori definition of what will be detected, and the number of clinical interpretations that can be made from a single patient sample. In order to continue to support the development of useful medical information, FDA believes the most efficient possible approaches to regulating NGS tests should be considered. Among the possibilities, a standards-based approach to analytical performance of NGS tests and the use of centralized curated databases containing up-to-date evidence to support clinical performance are under discussion.
The public workshop on NGS regulation will be held on February 20, 2015; the agency will receive comments until March 20, 2015.
In 2013, the Supreme Court considered whether isolated genes qualified as patentable subject matter in AMP v. Myriad (see here for analysis). That case centered on Myriad Genetics' patent claims to isolated BRCA1 and BRCA2 genes which are used to provide genetic testing to detect an increased genetic susceptibility to developing breast and/or ovarian cancer. The Court rejected the patent claims to the isolated genes, noting that:
[S]eparating that gene from its surrounding genetic material is not an act of invention.
After the Court’s opinion was issued, several genetic testing companies immediately moved into the marketplace, offering diagnostic genetic testing for mutations in the BRCA1 and BRCA2 genes (Ambry Genetics and others, see here). At that time, Myriad still held other patent claims that had not been challenged in the earlier litigation. Therefore, following the Supreme Court decision, Myriad promptly filed suit against the new entrants, asserting remaining patent claims to DNA primers (short DNA sequences used to initiate the amplification/copying of a BRCA1/2 sequence) and to testing methods. Myriad sought a preliminary injunction against Ambry Genetics. In University of Utah Research Foundation et al. v. Ambry Genetics, issued earlier this year, a federal district court denied a preliminary injunction to Myriad, concluding that Myriad could not show a reasonable likelihood of success on the merits because the asserted patent claims were likely not patentable subject matter. Now, on appeal, the Federal Circuit has upheld the denial of the preliminary injunction, in In re BRCA1-And BRCA2-Based Hereditary Cancer Test Patent Litigation, issued last week. With respect to the patent claims to the primers, the court stated:
The primers before us are not distinguishable from the isolated DNA found patent-ineligible in Myriad and are not similar to the cDNA found to be patent-eligible. Primers necessarily contain the identical sequence of the BRCA sequence directly opposite to the strand to which they are designed to bind. They are structurally identical to the ends of DNA strands found in nature.
In this recent litigation, Myriad had also asserted several method claims that captured the basic process of comparing the sequence of a patient’s BRCA1 or BRCA2 genes with a wild-type (normal) DNA sequence of the relevant gene. In its analysis, the Federal Circuit rejected these claims, relying not on the natural phenomenon or law of nature exceptions, but on the “abstract idea” exception to patentable subject matter (an exception most recently discussed by the Supreme Court in Alice Corp. v. CLS Bank earlier this year):
Here, under our earlier decision, the comparisons described in the first paragraphs of claims 7 and 8 are directed to the patent-ineligible abstract idea of comparing BRCA sequences and determining the existence of alterations. The methods, directed to identification of alterations of the gene, require merely comparing the patient’s gene with the wild-type and identifying any differences that arise.
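For illustration only, the kind of comparison the rejected method claims describe, matching a patient's sequence against a wild-type reference and flagging any differences, can be sketched in a few lines of Python. The sequences below are made-up toy fragments, not real BRCA1/2 data:

```python
# Illustrative sketch of the comparison described in the invalidated method
# claims: align a patient sequence against a wild-type reference and list
# any single-base differences. Toy data only -- not actual BRCA1/2 sequence.

def find_alterations(patient: str, wild_type: str) -> list:
    """Return (position, wild_type_base, patient_base) for each mismatch."""
    return [
        (i, wt, pt)
        for i, (wt, pt) in enumerate(zip(wild_type, patient))
        if wt != pt
    ]

wild_type = "ATGGATTTATCTGCT"   # hypothetical reference fragment
patient   = "ATGGATTCATCTGAT"   # hypothetical patient fragment

print(find_alterations(patient, wild_type))
# -> [(7, 'T', 'C'), (13, 'C', 'A')]
```

The triviality of this operation, a character-by-character comparison, underscores the court's point that "comparing and identifying differences" is an abstract idea rather than an inventive step.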
In this second round of litigation over the DNA primers, the Federal Circuit relied on the Supreme Court’s reasoning regarding isolated genes to invalidate the DNA primer claims (the Federal Circuit had initially upheld the patent claims to isolated genes in 2012, only to be reversed by the Supreme Court). With respect to the method claims, the Federal Circuit relied on the abstract ideas exception, as it had in rejecting similar method claims in its 2012 decision preceding the Supreme Court case. This decision is the latest round in Myriad’s attempts to restrict competition in the BRCA1 and BRCA2 genetic testing field through assertion of patent claims; Myriad has now lost on patent claims to isolated genes, DNA primers, and basic genetic testing methods (only claims to cDNAs were upheld). Ambry hailed the Federal Circuit's decision. Also last week, the United States Patent and Trademark Office (PTO) issued its 2014 Interim Guidance on Patent Subject Matter Eligibility. This updated guidance follows an unusually high number of Supreme Court cases examining patentable subject matter in life science, business method, and software patents over the last 6-7 years, and the PTO has been publishing its evolving thinking on these issues. The PTO is seeking public comment on the Interim Guidance until March 16, 2015, and it will hold a public forum on these issues in January 2015.
The 2014 midterm elections contained a number of state ballot measures on policy issues involving biotechnology. Not surprisingly, the issue of whether foods containing genetically engineered ingredients should be labeled appeared on two state ballots. In Oregon, Measure 92 was apparently narrowly defeated (50% to 49%) (Oregon had also rejected a similar ballot measure in 2002). However, the narrowness of the vote has now resulted in this week's order of a recount. With respect to Colorado's proposed labeling measure, the vote was not so close: Proposition 105 was defeated 65%-34%. These defeats leave the current status of state labeling measures as follows: Vermont has passed a labeling law that takes effect in 2016; Connecticut and Maine have also passed labeling laws, but their implementation is conditioned on a trigger requiring that neighboring states pass similar measures (which has not yet occurred). A second issue with implications for biotechnology on the ballots this month was fetal personhood: two such initiatives, on the ballots in North Dakota and Colorado, would have declared personhood to begin at the moment of conception; these and similar measures have been crafted by anti-choice groups in order to elevate the constitutional status of the unborn and effectively criminalize abortion. However, conception-triggered personhood also has implications for the field of human embryonic stem cell (hESC) research: the use of such cells requires their removal from an early-stage embryo, which, under a fetal personhood statute, effectively becomes a criminal act against a legally declared person. As a result, these initiatives have also threatened hESC research. The measures in North Dakota and Colorado were both rejected (by almost identical margins of 65% to 35%). To date, all fetal personhood ballot measures in the states have failed.
A third issue on the November ballots was agricultural, relating to the presence of genetically engineered crops: several county-wide ballot measures that would ban the planting and cultivation of genetically engineered crops (on the ballots as "genetically modified organisms") were passed in Humboldt County, California and Maui County, Hawaii. Lastly, a bond measure in Maine to authorize funding "to discover genetic solutions for cancer and the diseases of aging" passed overwhelmingly. The 2014 elections continued the ongoing attention to GMO labeling and fetal personhood initiatives as the most contentious state-based legislative battles affecting biotechnology.
Several years of turbulence over the legitimacy and value of research that alters the genetics of highly dangerous pathogens in order to probe the relationship between genotype and function form the backdrop to a surprising announcement from the Obama administration regarding government funding of such research. Experiments that introduce genetic changes into the genomes of dangerous pathogens in order to study transmissibility and pathogenicity may confer new or enhanced functions on a pathogen, making it more dangerous than in its native state. These are termed “gain-of-function” (GOF) studies, a category of dual-use research of concern (DURC). Controversy ensued in 2011 over the publication of some of these experiments with highly pathogenic H5N1 influenza viruses. New federal policies regarding funding and oversight of DURC life science research were published, but those policies presupposed the continuation of such research. Now the administration has issued a moratorium on federal funding of such research, asserting that more consideration is required of the biosafety and biosecurity issues raised by this work:
Gain-of-function studies, or research that improves the ability of a pathogen to cause disease, help define the fundamental nature of human-pathogen interactions, thereby enabling assessment of the pandemic potential of emerging infectious agents, informing public health and preparedness efforts, and furthering medical countermeasure development. Gain-of-function studies may entail biosafety and biosecurity risks; therefore, the risks and benefits of gain-of-function research must be evaluated, both in the context of recent U.S. biosafety incidents and to keep pace with new technological developments, in order to determine which types of studies should go forward and under what conditions. In light of recent concerns regarding biosafety and biosecurity, effective immediately, the U.S. Government (USG) will pause new USG funding for gain-of-function research on influenza, MERS or SARS viruses, as defined below. This research funding pause will be effective until a robust and broad deliberative process is completed that results in the adoption of a new USG gain-of-function research policy.
The statement calls on the National Science Advisory Board for Biosecurity (NSABB) and the National Research Council to undertake more formal evaluative reviews of the pros and cons of these experiments. The NSABB held a public meeting last week to begin that work. This announcement did not occur in a vacuum: in the wake of the Ebola virus disease outbreaks and the recent reports of widespread biosafety lapses at high-profile federal laboratories, attention has focused on the status of research into pathogens that cause infectious diseases and into vaccines and drugs for prevention and treatment. The scientific community has been divided over the specific issue of GOF research; both sides have gone public with the formation of professional advocacy groups that support or oppose such research. The control of federal funding for research can be subject to executive branch discretion. An earlier high-profile example of an executive branch dictate is the Bush-era policy of refusing to fund the establishment of new embryonic stem cell lines. This announcement is directed to a pause in funding and the encouragement of what appears to have been missing for the past several years: official acknowledgement that the funding of such controversial research required a formal deliberative process that should precede, not follow, official decisions to allow the work. The statement also calls on those providing private funds for such research to implement a voluntary pause, pending the upcoming reviews.
The FDA is now seeking public comment on a guidance document, Framework for Regulatory Oversight of Laboratory Developed Tests, that will launch the agency’s entry into a new role regulating a sector of laboratory testing known as laboratory-developed tests, or LDTs. These tests are offered as services by commercial laboratories and are developed by the laboratory itself. In the field of genetic testing, LDTs are the predominant form of testing available to the public: a consumer can send a biological sample to a lab for testing (direct-to-consumer tests) or provide a sample to a health care provider who mediates the testing process (see here). For years, the FDA has declared that while it has the legal authority to regulate such tests, it has exercised “enforcement discretion” – deciding not to require premarket approval from labs seeking to offer such services. As this marketplace has grown, the agency has moved closer to establishing a more formal role over the quality of these tests, and now has done so with its announcement of how it will require the industry to engage with the FDA in order to offer these tests (see here for the agency's new notification requirements for laboratories). The new proposed framework contains the following observations from the FDA as to why it needs to step up now:
LDT’s are important to the continued development of personalized medicine, but it is important that in vitro diagnostics are accurate so that patients and health care providers do not seek unnecessary treatments, delay needed treatments, or become exposed to inappropriate therapies.
The FDA has generally not enforced premarket review and other applicable FDA requirements because LDTs were relatively simple lab tests and generally available on a limited basis. But, due to advances in technology and business models, LDTs have evolved and proliferated significantly since the FDA first obtained comprehensive authority to regulate all in vitro diagnostics as devices in 1976. Some LDTs are now more complex, have a nation-wide reach and present higher risks, such as detection of risk for breast cancer and Alzheimer’s disease, which are similar to those of other IVDs that have undergone premarket review.
The FDA has identified problems with several high-risk LDTs including: claims that are not adequately supported with evidence; lack of appropriate controls yielding erroneous results; and falsification of data. The FDA is concerned that people could initiate unnecessary treatment or delay or forego treatment altogether for a health condition, which could result in illness or death. The FDA is aware of faulty LDTs that could have led to: patients being over- or undertreated for heart disease; cancer patients being exposed to inappropriate therapies or not getting effective therapies; incorrect diagnosis of autism; unnecessary antibiotic treatments; and exposure to unnecessary, harmful treatments for certain diseases such as Lyme disease.
The House Energy and Commerce Committee Subcommittee on Health held a hearing on the proposed framework last month. The agency has opened the guidance document for a 120-day public comment period, which will last until February 2, 2015.
The urgency of calls for reliable and scaled public health responses to the magnitude of the international Ebola virus disease (EVD) outbreak is increasing; a media frenzy over the apparent emergence of Ebola in Dallas, TX is underway. At present, over 7,000 cases and over 3,300 deaths are attributed to the outbreak concentrated in West Africa, where the virus emerged earlier this year. The crisis requires a combination of public health resources (personnel, facilities, diagnostic capabilities) and effective countermeasures (vaccines, drugs). To date, the most effective drug against the virus appears to be ZMapp, an antibody-based treatment developed by Mapp Biopharmaceutical. However, ZMapp supplies are limited, and efforts to scale up production, while necessary, will still not produce enough to meet demand. An expert consultation called by the World Health Organization this week to review vaccine candidates has identified two Ebola vaccines that are ready for Phase I clinical trials. Scheduled milestones for the testing and evaluation of these potential vaccines are now published. Even in a best-case scenario, in which a vaccine candidate performs well enough to justify scale-up and distribution, WHO does not anticipate that a significant number of vaccine doses will be ready before around March 2015. The WHO meeting participants noted the unprecedented severity and spread of this viral disease relative to other public health crises:
Participants also drew heavily on lessons learned, in the African setting, during trials for candidate malaria, HIV/AIDS, cholera, epidemic meningitis, hepatitis B, and other vaccines. As some experts noted, never again can the international community allow what boils down to “market failure” to create such catastrophic suffering for humanity in any country, in any region of the world. The sense of urgency and need for speed, without compromising the integrity of studies or the quality of their data, are fully justified by the dire situation in affected countries and the risk that other countries may soon experience their first imported cases. The Ebola outbreak currently ravaging parts of West Africa is the most severe acute public health emergency in modern times. Never before in recent history has a biosafety level 4 pathogen infected so many people so quickly, over such a wide geographical area, for so long.
The CDC has already estimated that 1.4 million cases of EVD could emerge by January of next year. Case projections from WHO are smaller, but both agencies point out that a near-term ability to reverse the course of EVD will depend on successful public health efforts, including patient isolation and contact avoidance measures. Drugs and vaccines will not be produced fast enough in the short term to meet demand, but the efforts described above are good starts. More generally, as the WHO acknowledged with its reference to “market failure,” it is clear that government expenditures for research on countermeasures need to continue, but the development of a true pipeline from lab to clinic is still lacking. There is a translational failure here: rare but deadly outbreaks of diseases like Ebola are not met with stockpiles of drugs. The U.S. Strategic National Stockpile maintains supplies of medical countermeasures for use in epidemics and bioterrorism events, but these are directed to more familiar threats such as influenza and anthrax. The current Ebola outbreak illustrates the consequences of focusing national resources too narrowly on the usual suspects in bioterrorism or pandemic crises, with the consequence that new or remote pathogens emerging in the U.S. are not met immediately with effective countermeasures that would contain disease outbreaks.
Ebola virus disease (EVD) has appeared in several West African nations over the last several months, and is now spreading with increasing speed. The international public health response has involved the World Health Organization (WHO), the Centers for Disease Control and Prevention (CDC), Doctors Without Borders (MSF), and local public health authorities, among others. WHO has now formulated an Ebola response roadmap for the crisis. The history of Ebola virus outbreaks shows the first recognition of the pathogen in 1976, followed by several decades of periodic outbreaks with various virus subtypes. Dr. Anthony Fauci, director of the National Institute of Allergy and Infectious Diseases (NIAID), placed the current EVD outbreak in historical context:
In most instances, the virus emerged in geographically restricted, rural regions, and outbreaks were contained through routine public health measures such as case identification, contact tracing, patient isolation, and quarantine to break the chain of virus transmission. In early 2014, EVD emerged in a remote region of Guinea near its borders with Sierra Leone and Liberia. Since then, the epidemic has grown dramatically, fueled by several factors. First, Guinea, Sierra Leone, and Liberia are resource-poor countries already coping with major health challenges, such as malaria and other endemic diseases, some of which may be confused with EVD. Next, their borders are porous, and movement between countries is constant. Health care infrastructure is inadequate, and health workers and essential supplies including personal protective equipment are scarce. Traditional practices, such as bathing of corpses before burial, have facilitated transmission. The epidemic has spread to cities, which complicates tracing of contacts. Finally, decades of conflict have left the populations distrustful of governing officials and authority figures such as health professionals. Add to these problems a rapidly spreading virus with a high mortality rate, and the scope of the challenge becomes clear.
To date, at least 3,000 cases and over 1,800 deaths have been reported in the West African nations of Sierra Leone, Guinea, and Liberia, with numbers rising. Isolated cases have also been observed in Nigeria and Senegal. The case fatality rate is currently estimated at around 60%. No vaccine or effective antiviral is currently available against the virus. An unapproved cocktail of monoclonal antibodies produced by Mapp Biopharmaceutical of San Diego, called ZMapp, was administered to several health workers from the U.S.; their recovery may be partially explained by the use of this drug (although their access raises ethical questions regarding how to allocate scarce countermeasures). That drug essentially transfers an immune response to the patient (passive immunity). The other modalities for EVD treatment and prevention are the more commonly known avenues of vaccines that elicit the patient’s own immune response and/or antiviral drugs that interfere with virus replication. The availability of vaccines and antivirals for “emerging or reemerging diseases” illustrates the deficiencies in matching market realities to public health demands. The research that led to ZMapp was partially funded by the U.S. government as part of its program to establish medical countermeasures against a bioterrorist attack (resources that greatly expanded after 9/11). The supply of ZMapp is limited; HHS is now funding expanded production and formal clinical trials of the drug. In addition, several Ebola vaccines will enter clinical trials soon. But the availability of countermeasures, while necessary, is not the only determinant of how soon the outbreak (not yet called a pandemic) will be contained. The international public health infrastructure, ideally coordinated by WHO, is dependent on funding from national governments, and mandated funding has declined over the years, undercutting WHO's capabilities.
WHO did not declare a public health emergency until August, despite the fact that cases began to spread in March. While more vaccines and antivirals can make a difference in any viral disease outbreak, the spread of EVD could have been managed with a more robust public health emergency response earlier this year. MSF has called for countries with biological disaster response teams (e.g., the U.S.) to send these personnel to the region to augment field hospitals, diagnostic laboratories, and other facilities needed to manage the crisis.
In a move that will significantly impact the field of genetic testing, the FDA has notified Congress that it intends to issue a formal draft guidance that will detail the agency’s plan for formal regulation of laboratory-developed tests (LDTs). LDTs are biochemical or genetic tests that are offered as services by commercial laboratories, whether to medical personnel or directly to consumers (DTC). Over the years, the FDA has sent mixed signals about its regulatory posture toward these tests, which constitute the majority of commercially available genetic tests offered in the U.S. (an estimated 11,000 tests offered by 2,000 laboratories). Now, in letters sent to the Senate Committee on Health, Education, Labor and Pensions and the House Committee on Energy and Commerce, the FDA announced that the draft guidance, Framework for Regulatory Oversight of Laboratory Developed Tests (LDTs), will be published within 60 days. The FDA defines LDTs as medical devices, falling within the subset of devices known as in vitro diagnostics (IVDs). As medical devices, LDTs are subject to the agency’s existing authority under the 1976 Medical Device Amendments (MDA). To date, the FDA has asserted that it exercised “enforcement discretion” for LDTs – which generally meant no regulation. That will now change. The FDA will design a risk-based classification system for LDTs (Class I-III), which parallels the existing medical device regulatory structure. For the highest-risk LDTs (Class III), the FDA will require premarket approval, phasing that requirement in over four years, while existing tests stay on the market. Moderate-risk LDTs (Class II) will be subject to registration, listing and adverse event reporting requirements.
The FDA will regard companion diagnostic tests, genetic tests that are used in tandem with an approved therapeutic drug to assess patient suitability (e.g., the genetic test for the HER-2 gene that determines whether Herceptin should be administered to breast cancer patients), as high-risk Class III devices. The FDA describes the factors that will be used to assess LDT risk and classification:
FDA will rely upon the existing medical device classification system to evaluate the risk of a category of LDTs and, informed by the industry’s expressed interest in participating in the discussion of the classification process, will use expert advisory panels to help classify devices not previously classified by FDA, as appropriate. In determining the risk an LDT poses to the patient and/or the user, FDA will consider several factors including whether the device is intended for use in high risk disease/conditions or patient populations, whether the device is used for screening or diagnosis, the nature of the clinical decision that will be made based on the test result, whether a physician/pathologist would have other information about the patient to assist in making a clinical decision (in addition to the LDT result), alternative diagnostic and treatment options available to the patient, the potential consequences/impact of erroneous results, number and type of adverse events associated with the device, etc.
Risk will correlate with the likelihood that a genetic test result will deliver information that a patient will use to make significant medical decisions (e.g., a BRCA1/2 genetic test result that some patients rely on to elect prophylactic mastectomy based on breast cancer risk). Although the FDA’s move is not a complete surprise, it will significantly alter the business landscape for the LDT genetic testing industry as it contends with formal approvals and regulatory compliance measures for lab tests that have been or will be developed. Not all stakeholders are pleased with the FDA decision. The American Clinical Laboratory Association (ACLA), which represents the nation's leading providers of clinical laboratory services, filed a citizen petition with the FDA in 2013, asking it to refrain from imposing new regulations on LDTs and asserting that existing regulations are adequate; the FDA denied the request. Just last month, a coalition of academic lab directors filed a statement of opposition with the Office of Management and Budget (OMB), disputing the FDA’s jurisdiction and alleging that new regulations on LDTs would stifle the innovative environment that has produced the thousands of LDTs already available. The FDA will proceed on its announced schedule; public comments will be sought and public hearings will be held. The industry was braced for the FDA's action: a leading genetic test provider, 23andMe, had already anticipated the FDA's moves and initiated its own regulatory relationship with the agency.
In a climate where concerns over experiments that create potentially dangerous new pathogens have been amplified by recent biosafety lapses in high-profile labs, a new group of scientists, Scientists for Science (SFS), emerged this week to assert that research on dangerous pathogens can be (and generally is) conducted safely, that it is adequately regulated, and that such work is critical for public health (the current Ebola virus public health crisis contributes to public attention on these issues). This statement and organization contrast with, and follow, the recent statement from the newly formed Cambridge Working Group (CWG) (see here) that called for curtailing certain high-risk pathogen experiments until a thorough cost-benefit analysis is conducted by the scientific community, along the lines of the work done by the 1975 Asilomar conference on the risks of recombinant DNA research. The SFS rejects the Asilomar comparison and does not call for limiting such experiments now, but it does call for a more formal review of risks and benefits conducted by an outside expert body, such as the National Academy of Sciences (NAS). From the SFS statement:
Scientists for Science are confident that biomedical research on potentially dangerous pathogens can be performed safely and is essential for a comprehensive understanding of microbial disease pathogenesis, prevention and treatment. The results of such research are often unanticipated and accrue over time; therefore, risk-benefit analyses are difficult to assess accurately.
If we expect to continue to improve our understanding of how microorganisms cause disease we cannot avoid working with potentially dangerous pathogens. In recognition of this need, significant resources have been invested globally to build and operate BSL-3 and BSL-4 facilities, and to mitigate risk in a variety of ways, involving regulatory requirements, facility engineering and training. Ensuring that these facilities operate safely and are staffed effectively so that risk is minimized is our most important line of defense, as opposed to limiting the types of experiments that are done.
In contrast to recombinant DNA research at the time of Asilomar in 1975, studies on dangerous pathogens are already subject to extensive regulations. In addition to regulations associated with Select Agent research, experimental plans on other pathogens are peer reviewed by scientists and funding agencies, and the associated risk assessments are considered by biosafety experts and safety committees. Risk mitigation plans are proposed and then considered and either approved or improved by safety committees.
If there is going to be further discussion about these issues, we must have input from outside experts with the background and skills to conduct actual risk assessments based on specific experiments and existing laboratories. Such conversations are best facilitated under the auspices of a neutral party, such as the International Union of Microbiological Societies or the American Society for Microbiology, or national academies, such as the National Academy of Sciences, USA. We suggest they should organize a meeting to discuss these issues.
The CWG and SFS clearly do not agree on the need to halt certain experiments while a more thorough review of cost-benefit parameters for high-risk pathogen research is conducted. However, a consensus is clearly emerging on both sides that public confidence in such experiments, as well as continued funding support, does require more thorough assessment from outside experts. Neither statement references the National Science Advisory Board for Biosecurity (NSABB), the federal advisory committee that was most recently called on to assess the publication of gain-of-function influenza H5N1 research in 2011. That panel has recently been reshuffled by NIH. A high-level study by the NAS on these issues, therefore, appears welcome to all sides of the current debate. The NAS (through its National Research Council) did undertake a general review of bioterrorism research post-9/11, entitled Biotechnology Research in an Age of Terrorism: Confronting the Dual Use Dilemma (2004) (the Fink Report), which called for the establishment of the NSABB, so there is a certain circularity here.
An update on the efforts to enact state laws that mandate the labeling of genetically engineered (GE) foods: at present, Connecticut and Maine have enacted conditional GE food labeling laws, which are not to take effect until a requisite number of neighboring states also pass such laws; effectively, these laws are dormant right now. In contrast, Vermont passed a GE food labeling law, Act 120, in April of this year, with its mandates to take effect in July 2016. Vermont did not follow the conditional model set by the other states, and supporters of the law expected litigation to follow. Since the enactment of this law, Vermont has been sued by the Grocery Manufacturers Association and other trade groups, which filed their complaint in June. The complaint alleges a violation of the First Amendment, arguing that the manufacturers will be subject to a form of compelled speech and, as such, that this amounts to an impermissible content-based regulation that is unconstitutional. Even accounting for the sometimes more deferential review of speech-related laws that target commercial entities (see Central Hudson Gas & Electric Corp. v. Public Service Commission (1980)), the plaintiffs assert that the state fails even to muster a “substantial government interest,” noting the failure of an earlier Vermont labeling law on dairy products produced from BGH-treated animals to pass constitutional scrutiny in International Dairy Foods Association v. Amestoy (2d Cir. 1996) (one of the plaintiffs in this current litigation, the International Dairy Foods Association, had challenged the earlier law as well).
The complaint against the Vermont GE law further alleges a Fifth Amendment defect pertaining to the vagueness of some terms in the statute, as well as a dormant Commerce Clause violation in view of the extraterritorial effects of the Vermont law on companies based outside Vermont that would be required to “establish Vermont-specific distribution channels.” As this litigation unfolds, efforts continue to mandate GE food labeling in other states. Active legislative efforts are underway in Pennsylvania, New Jersey, Illinois, Massachusetts, and New Hampshire. Lastly, voter-initiated ballot measures to mandate GE food labeling will appear this fall in Colorado and Oregon (earlier initiatives in Washington (2013) and California (2012) failed by narrow margins). All of these state efforts arise in the absence of (and official resistance to) any federal scheme for mandatory labeling of GE foods; the FDA imposes no such requirement but does allow voluntary labeling.
A new coalition of scientists, the Cambridge Working Group (CWG), has emerged with the goal of entering into the debate over whether and how scientific experiments that deliberately create new pathogens should be conducted. This class of experiments aims to understand how mutations introduced into the genome of a known pathogen (e.g., H5N1 influenza virus) alter its properties. One of the goals cited for these experiments is to assist public health officials in identifying the emergence of potentially worrisome (viral) strains with possible pandemic potential. However, concerns over the safety of these laboratory experiments have been heightened in view of recent high-profile biosafety lapses in government labs (see here). Some of the scientists in the CWG have been members of the National Science Advisory Board for Biosecurity (NSABB), the federal advisory group that advises on biosecurity issues and dual-use research (NSABB has just announced a reshuffling of personnel, replacing almost half the current roster with new members). The controversy of two years ago, in which “gain-of-function” (GOF) experiments with H5N1 influenza virus were conducted and then published, has continued as new experiments proceed with what have been called “potential pandemic pathogens” (PPP). The class of PPP could include de novo constructed viruses as well as attempts to recreate previously known and dangerous pathogens. Recently, one of the same labs that had published the H5N1 influenza experiments in 2012 reported that it had created a new influenza virus homologous (similar) to the 1918 influenza virus, which caused a pandemic that killed between 20 and 50 million people (the CDC had already reconstructed the 1918 virus in 2005). This week, the CWG announced its formation and issued this statement:
Recent incidents involving smallpox, anthrax and bird flu in some of the top US laboratories remind us of the fallibility of even the most secure laboratories, reinforcing the urgent need for a thorough reassessment of biosafety. Such incidents have been accelerating and have been occurring on average over twice a week with regulated pathogens in academic and government labs across the country. An accidental infection with any pathogen is concerning. But accident risks with newly created “potential pandemic pathogens” raise grave new concerns. Laboratory creation of highly transmissible, novel strains of dangerous viruses, especially but not limited to influenza, poses substantially increased risks. An accidental infection in such a setting could trigger outbreaks that would be difficult or impossible to control. Historically, new strains of influenza, once they establish transmission in the human population, have infected a quarter or more of the world’s population within two years.
For any experiment, the expected net benefits should outweigh the risks. Experiments involving the creation of potential pandemic pathogens should be curtailed until there has been a quantitative, objective and credible assessment of the risks, potential benefits, and opportunities for risk mitigation, as well as comparison against safer experimental approaches. A modern version of the Asilomar process, which engaged scientists in proposing rules to manage research on recombinant DNA, could be a starting point to identify the best approaches to achieve the global public health goals of defeating pandemic disease and assuring the highest level of safety. Whenever possible, safer approaches should be pursued in preference to any approach that risks an accidental pandemic.
The field of molecular biology previously confronted a scenario where the development of new technologies outpaced a thorough assessment of their potential risks. In a 1974 statement from a committee of the National Academy of Sciences that considered then-emerging recombinant DNA experiments, the concerns expressed about that technology are similar to the current responses to PPP experiments, although today’s concerns apply to the possible enhancement of already known pathogens. That 1974 report stated:
Several groups of scientists are now planning to use this technology to create recombinant DNAs from a variety of other viral, animal, and bacterial sources. Although such experiments are likely to facilitate the solution of important theoretical and practical biological problems, they would also result in the creation of novel types of infectious DNA elements whose biological properties cannot be completely predicted in advance.
The new CWG call for an Asilomar-type approach to evaluating and managing the risks of PPP experiments references the foundational 1975 Asilomar conference called by scientists to deliberate how recombinant DNA experiments could be safely performed. That conference established principles for conducting the new recombinant DNA experiments, leading to the issuance of the 1976 Guidelines by the newly formed Recombinant DNA Advisory Committee (RAC) (see recent post on RAC’s future). It is important to note that there is disagreement within the scientific community regarding the need for GOF influenza (PPP) experiments; see here for a brief on the value of such work and here for a critical take on such research. The call for further investigation of PPP research by the scientific community echoes the approach of Asilomar, but it also serves as a notice to regulatory authorities (e.g., funding agencies) that the scientific community is aware of public concerns and is responding with deliberation. Professional self-regulation could preempt government-initiated controls on PPP research, such as funding restrictions.
Recent developments regarding biosafety practices in several government laboratories have raised concerns about the containment of potentially dangerous pathogens in scientific research. In the past month, a series of separate incidents exposed weaknesses in the oversight and management of dangerous pathogens. These included the accidental exposure of CDC scientists to anthrax in CDC labs, the discovery of forgotten vials of viable smallpox virus in an FDA lab housed at the NIH, and the unintentional cross-contamination of a benign influenza strain with a dangerous H5N1 strain and its subsequent transfer. All of these events involve naturally occurring pathogens, but they also occur at a time when public debate continues over the deliberate creation of potentially dangerous pathogens in the field of dual-use research of concern (DURC). With scientists reporting the creation of new pathogens in order to define what genetic changes correlate with pathogenicity or transmissibility, concerns emerged as to how such scientific detail should be publicly shared and how such pathogens were to be safely contained in the laboratory environment. Most of the attention focused on the publication of genetic detail, evidencing concerns that the pathogens could be reconstructed for malicious intent. However, an equally serious concern related to the possibility that the newly designed pathogens could be released inadvertently, due to laboratory or personnel errors. This recent series of safety lapses amplifies the concerns over the general state of biosafety practices in laboratories handling the most dangerous pathogens, whether natural or engineered. Although the standards for the containment of dangerous pathogens in laboratories are well known and generally followed, one untoward release of a high-risk infectious agent could be catastrophic.
In general, there are established biosafety protocols that define the required facilities, procedures, and personnel based on the level of risk that a particular pathogen poses to public health and/or the environment. The guidelines assign a biosafety level (BSL) based on that analysis; high-containment BSL-3 and BSL-4 labs are required for work with the most dangerous pathogens. In addition, the transport of such pathogens is also managed with protocols that establish safe transfer. The CDC director, Dr. Thomas Frieden, conceded the pattern of biosafety lapses in a press conference and at a Congressional hearing yesterday. Dr. Frieden has shut down several labs and instituted a moratorium on some shipments of pathogens. The Government Accountability Office (GAO) has conducted several studies of safety in high-containment laboratories (private or public), specifically noting the absence of any overarching federal body to oversee these laboratories, particularly in view of their proliferation in the years since 9/11 (and the anthrax incidents), as bioterror-related research has increased (note the recent controversy over a BSL-4 lab established in Boston). The GAO was represented at yesterday’s hearing, where it reminded Congress of its previous investigations and recommendations on laboratory safety; the recent incidents may cause Congress to revisit this work and act accordingly.
New developments in the question of what subject matter is eligible for patenting illustrate how this seemingly foundational question for the high-tech fields of biotechnology and software remains unsettled, even decades after these industries entered the commercial sector. Patenting for these technologies must not run afoul of the long-standing prohibition on patenting natural phenomena, laws of nature, and abstract ideas. Recent years have seen the landmark cases of AMP v. Myriad (2013) and Mayo v. Prometheus (2012) issue from the Supreme Court and impact the life science sector. Myriad took on the eligibility of product claims directed to substances derived from natural sources (e.g., genes/isolated DNA) and found them invalid. Mayo considered the eligibility of method claims in which a natural correlation or relationship was embedded in the claim, risking the possibility that the patent claim “preempts” the use of the natural correlation, and invalidated a method claim for optimizing drug dosage because it violated the prohibition on patenting laws of nature. Following these cases, the U.S. Patent and Trademark Office (PTO) issued new subject matter guidance for patent examiners in applying these holdings to the examination of new patent applications. This guidance document was discussed in an open forum in May of this year, and the PTO has extended the deadline for public comments to July 31, 2014. Last week, at the annual BIO convention in San Diego, the PTO presented a set of model patent claims derived from the discovery of a protein antibiotic and written post-Myriad, and invited further comments. So the question of how life science patenting steers clear of capturing natural, uninvented subject matter continues. In the field of software patenting, a different question continues to be debated.
The patenting of computer-based technologies, which allow all manner of operations and processes to be carried out through digitization/computer code, must not cross over into the patenting of abstract ideas. Until this month, the most recent precedent on this point, Bilski v. Kappos (2010), disallowed the patenting of a computer-implemented scheme for hedging against the risk of price changes because the Court declared that it was no more than an abstract idea. Just two weeks ago, the Supreme Court issued Alice Corporation Pty. Ltd. v. CLS Bank International, in which it considered patent claims to a computer-implemented scheme for mitigating settlement risk in transactions. The Court declared those patent claims invalid as well, holding that they improperly claimed an abstract idea, which is not patentable. The PTO has issued new preliminary examination instructions that apply Alice Corp. to patent examination practice. Both lines of cases (life science and computer-related) reiterate existing precepts of patentable subject matter, but the interpretation of these cases remains unsettled. With the comment process now underway at the PTO, the post-Myriad/Mayo guidelines that extend the logic of the ruling on isolated genes to many other natural substances and products will be revisited, but it's possible that full development of the scope of this exception will again require judicial review.
The need for regulation of genetic testing offered by genomics companies has been debated for years, accompanied by mixed signals from the FDA over its role in such efforts. The bulk of genetic testing services offered by private companies are offered as laboratory-developed tests (LDTs) that are purchased from the test developer. Laboratories providing LDTs are regulated under the Clinical Laboratory Improvement Amendments (CLIA), administered by the Centers for Medicare and Medicaid Services (CMS), which requires that laboratories meet specified standards and that individual tests be scientifically accurate, but which does not evaluate the clinical validity or clinical utility of LDTs. In addition, direct-to-consumer (DTC) genetic tests are offered directly to the public and do not require a medical intermediary; results are provided to the client. The industry has long argued that its services were simply LDTs that did not require formal review as medical devices, but simply had to meet the general CLIA standards for clinical laboratories. However, the FDA has recognized that genetic testing may pose special concerns that warrant specific attention and has reacted accordingly. In 2010, the FDA sent warning letters to a number of genetic testing companies, including 23andMe, advising them that their genetic services, although direct to consumer, met the classification of a medical device requiring FDA approval. A Government Accountability Office (GAO) investigation in 2010 found that several companies provided inconsistent results to undercover consumers, and that the information provided also did not line up with the consumers’ actual clinical status. The Federal Trade Commission publishes consumer alerts stating that DTC genetic tests may not be reliable and warning that consumers may be deceived by marketing claims.
23andMe has offered several kinds of genetic testing, ranging from tests assessing risk for an individual disease or condition to aggregate testing for hundreds of DNA variants across an individual’s genome. A consumer receives information regarding possibly significant DNA variants identified in her genome, as well as a health report from the company assessing medical risk based on the genetic data. The FDA was most concerned about the company health report because it could form the basis for consumer medical decision-making using possibly weak or non-credible information. In 2012, 23andMe announced that it had filed a premarket notification submission with the FDA for its $99.00 Saliva Collection Kit and Personal Genome Service (PGS), a service offered since 2008. However, the company did not properly respond to the FDA’s inquiries and comments in the period that followed. As a result, in November 2013, the FDA ordered 23andMe to cease offering the PGS test product; the company complied and announced that its health reports would be discontinued (only providing "uninterpreted raw genetic data"). Now, the company has announced the filing of a premarket submission for a genetic test for Bloom syndrome – indicating a restart to its regulatory path with the FDA and even optimism:
Once cleared, it will help 23andMe, and the FDA, establish the parameters for future submissions. More importantly, for our customers, it marks a baseline on the accuracy and validity of the information we report back to them. The submission includes robust validation data covering major components of our product such as the genotyping chip, software and saliva kit.
As the contours of the regulatory process traveled by 23andMe for its services become clear, other personal genetic testing companies are likely to follow suit, and the FDA could begin to offer an orderly oversight structure for the products and services offered by this industry. Given the status of 23andMe as a flagship genomics company, and the hesitant moves by the FDA over these several years, the outcome of this submission and review will sketch out a regulatory path for the industry. However, reaction to FDA involvement is mixed; critics contest the level of consumer harm created by these products and argue that consumers should have the right to obtain their genetic data without government interference, noting the First Amendment right to receive information. Future consumers might contest excessive regulation on that basis.
In a move that signals the maturity of the gene therapy field, the National Institutes of Health (NIH) has announced that it will no longer subject all applications for gene therapy trials to automatic review by the Recombinant DNA Advisory Committee (RAC). Gene therapy is defined as:
the transfer of genetic material into humans with the goal of replacing or compensating for the function of abnormal genes, or to enhance the immune system’s ability to attack cancer cells.
RAC occupies a singular place in the history of government oversight of new technologies. The committee was established in 1974, following increasing concern among scientists in the then-emerging field of molecular biology as the techniques involving recombinant DNA were developed and disseminated. The Asilomar conference of 1975 originated with scientists, and it led to the publication of physical and biological containment strategies to limit the risk of working with recombinant organisms (e.g., bacteria, viruses). RAC issued the first Recombinant DNA Research Guidelines in 1976, and these were the precursor to later guidelines for the gene therapy applications that were first submitted to RAC in the late 1980s. Now, following a study from the Institute of Medicine (IOM) that called for streamlining the review process for gene therapy (removing redundancies in the review process), the NIH has acceded to its recommendation that RAC reviews of gene therapy be reserved for exceptional cases where both of these conditions exist:
1. The protocol review could not be adequately performed by other regulatory and oversight processes (for example, the institutional review boards, institutional biosafety committees, and the FDA).
2. One or more of the following criteria are satisfied:
Protocol uses a new vector, genetic material, or delivery method that represents a first-in-human experience, thus representing unknown risk.
Protocol relies on preclinical safety data that were obtained using a new preclinical model system of unknown and unconfirmed value.
Proposed vector, gene construct, or method of delivery is associated with possible toxicities that are not widely known and that may render it difficult for local and federal regulatory bodies to evaluate the protocol rigorously.
In reviewing the history of RAC oversight for the gene therapy field, the IOM stated:
When recombinant DNA technology was new, and the many risks concerning individual clinical trial protocols were uncertain, the public, scientists, and policy makers raised important questions about potential dangers—such as whether this technology could harm patients, create new infectious organisms, or make genetic alterations that could be passed down to future human generations. In its report, the IOM committee finds that the major concerns about recombinant DNA from 40 years ago do not raise the same level of concern today, as hundreds of gene therapy clinical trials have evaluated the technique’s safety and effectiveness.
The RAC stands as a model of a technology-specific review body set up to augment existing regulatory processes when a novel technology has emerged with potential risks to health and safety. This recent move now becomes a model for the partial deregulation of a maturing technology. Gene therapy protocols will continue to be reviewed by the FDA and institutional oversight panels. The IOM report recognizes that this model of regulatory layering still has relevance for current emerging technologies, and it specifically cites the field of nanotechnology as a candidate for a future RAC-like review body to consider its specific applications in medicine.
The public health campaign to eradicate smallpox, a highly contagious viral disease with significant mortality, was one of the public health success stories of the 20th century. Following decades of vaccination against the variola major virus, the causative agent of smallpox, the World Health Organization (WHO) declared the eradication of smallpox in 1980 (the last case in the U.S. occurred in 1949, and the last global case was in Somalia in 1977). With this goal achieved, a long-standing concern for international public health authorities has been how to limit and manage the remaining stocks of the smallpox virus: should variola virus be retained in laboratories, in view of its characterization as a dual-use microbe – a source of legitimate research interest as a viral pathogen as well as a potential bioweapon? In 1983, by international agreement, a decision was made to retain variola virus stocks at two WHO Collaborating Center laboratories: the State Research Centre for Virology and Biotechnology in Koltsovo, Russia, and, in the U.S., the Centers for Disease Control and Prevention in Atlanta, Georgia. Speculation has continued, however, over whether unaccounted-for virus stocks exist in other places. The World Health Assembly (WHA), the decision-making conference of the WHO, supported the policy of setting limits on access to variola virus, and, by 1994, endorsed the eventual destruction of the remaining virus stocks. Over the years, the WHO surveyed whether consensus research goals required continued maintenance of virus stocks. In 2011, the WHA reaffirmed the eventual destruction of the virus stocks, but deferred the setting of an actual date to the WHA conference in May of this year. Within the WHO, the two advisory committees that considered the issue reached different conclusions.
At this recent conference, it was decided that the virus stocks will continue to be maintained, because research goals dependent on access to the virus remain unfinished. These goals include the further development of antivirals (smallpox vaccines exist and are maintained in national and international stockpiles). Dissension among the WHO advisory committees contributed to the decision to delay the setting of a date for destruction of the stocks. Not surprisingly, a divergence of views on the merits of retaining variola virus stocks exists in the scientific community (support for maintaining virus stocks for future research versus calls for destroying the virus stocks) as well as among the WHO member states (with the U.S. favoring retention while less-developed countries have favored destruction). A further wrinkle in the debate is the emergence of new technical capabilities (e.g., synthetic biology techniques) that could allow artificial reconstruction of the smallpox genome; such a possibility shifts policy concerns away from access to viral stocks and toward access to genomic and technical information (actual publication). Of course, concerns about informational access have surfaced recently with research reporting the development of highly pathogenic H5N1 influenza viruses (see here and here). The still-unresolved smallpox (variola) virus retention issue illustrates how dual-use research concerns can originate from existing natural pathogens as well as newly engineered or synthesized pathogens.
Two counties in Oregon recently passed bans on the planting of genetically engineered (GE) crops; the Jackson and Josephine County measures passed handily. The Jackson County ordinance provides:
It is a county violation for any person or entity to propagate, cultivate, raise or grow genetically engineered plants in Jackson County.
These efforts began in Jackson County where proponents of the ban (local farmers) gathered enough signatures in 2013 to put a local GE ban on the ballot in the spring of 2014. That effort stirred opposition from seed companies and other parties, which reacted by pushing for a legislative fix to preempt such local agricultural governance. The result is that both of these newly-enacted county bans exist against the backdrop of a recently enacted state law that prohibits local counties from interfering with agricultural choice:
A local government may not enact or enforce a local law or measure, including but not limited to an ordinance, regulation, control area or quarantine, to inhibit or prevent the production or use of agricultural seed, flower seed, nursery seed or vegetable seed or products of agricultural seed, flower seed, nursery seed or vegetable seed.
The Oregon county measures are in line with bans enacted by other counties around the country (e.g., in Washington, Hawaii, and California). In Oregon and the other states with such bans, the rationales advanced for a ban on GE crops include the avoidance of genetic contamination of non-GE crops by neighboring GE crops (e.g., pollen drift). (Several years ago, an apparent genetic contamination of the wheat crop in eastern Oregon elevated concerns about other potential instances of genetic contamination.) A further motivation for a GE ban arises from the nature of the engineering itself – these crops are generally engineered for herbicide resistance (e.g., Monsanto's Roundup Ready technology) – meaning that the crops can withstand widespread application of weed-killers. As a result, heavy use of potentially dangerous herbicides is encouraged by the planting of these GE crops, and proponents of such bans point to the potential environmental and health complications from widespread herbicide use. The Oregon counties also have significant numbers of organic farming operations, which could face exposure to GE materials or herbicides, scenarios that conflict with established principles of organic farming.
An apparent clash between the new county ordinances (at least for Josephine County) and the state’s new preemption statute looms, lodged against a backdrop of the significant home rule environment for localities that is found in Oregon; Jackson County was granted a waiver from the state law because the ballot initiative had been established before the law was passed. A similar legal showdown could emerge in Hawaii, which has passed a state preemption statute following the enactment of local GE crop bans. The shifting legal landscape between assertions of local governance and reactive preemption is also occurring with respect to state mandates for the labeling of GE food and pending federal preemption efforts to nullify such laws (see earlier post here). In a further sign of Oregon's attention to the legal issues raised by GE crops and GE food, a citizens' effort is underway to get a mandatory GE food labeling law on the ballot for November, 2014.
In its first law enforcement action in the personalized genomics sector, the Federal Trade Commission (FTC) has entered a final consent order against several personal genomics companies for engaging in business practices that deceived consumers. Personalized genomics companies generally follow one of several business models. A company may offer genetic testing services in which a consumer pays to have her DNA analyzed for mutations that the company claims correlate with various medical conditions or susceptibilities. In another model, a company provides genetic testing and also offers products that allegedly treat or alleviate the medical conditions identified by the DNA testing. GeneLink, Inc. and its former subsidiary, foru International Corporation, followed the second model, offering what were claimed to be “genetically guided personalization of nutrient and skin care formulations” as part of a general anti-aging portfolio of services and products. The FTC filed a complaint against GeneLink and foru for statements and practices that violated the Federal Trade Commission Act, which prohibits false advertising and “unfair or deceptive trade practices.” The complaint recited promotional materials from GeneLink:
[B]y analyzing and understanding your unique genetic strengths and weaknesses, you can eliminate the guesswork and “genetically guide” the optimal nutritional supplement or skincare formulation to match your LifeMap Healthy Aging Assessment®.
The FTC cited the scope of the claims made by GeneLink:
According to ads and other promotional materials, the supplements could treat serious conditions like diabetes, heart disease, arthritis, and insomnia. Claims for the skin serum cited a “double blind, randomized and controlled study” and promised the product would “compensate for particular deficiencies in areas of skin aging, wrinkling, collagen breakdown, irritation, and the skin’s ability to defend against environmental stress.”
The violation of the FTC Act was recited in the complaint:
12. Through the means described in Paragraph 11, respondents have represented, expressly or by implication, that genetic disadvantages identified through respondents’ DNA Assessments are scientifically proven to be mitigated or compensated for with nutritional supplementation.
13. In truth and in fact, genetic disadvantages identified through respondents’ DNA Assessments are not scientifically proven to be mitigated or compensated for with nutritional supplementation. Therefore, the representation set forth in Paragraph 12 was, and is, false or misleading.
Following a period of public comment, a final consent order was entered to settle the charges brought against the companies. The companies are now prohibited from offering products for purposes not supported by credible scientific data, and the order specifies the level of scientific support required for health-related claims:
“[C]ompetent and reliable scientific evidence” shall consist of at least two adequate and well-controlled human clinical studies.
This FTC action no doubt puts the personalized genomics sector on notice that dubious claims for genetic “treatments” will be subject to FTC monitoring and enforcement. The action also exemplifies how the FTC, as the federal consumer protection agency, employs its broad mandate to reach many potentially deceptive business practices in high-technology areas: the companies were also charged with inadequate data security practices with respect to the collection of consumer information, and the consent order further requires them to institute appropriate data security measures for any future data collection. More generally for the genomics sector, the FTC action follows the Food and Drug Administration's (FDA) 2013 warning to 23andMe, one of the leading providers of personalized DNA testing, that its services constituted the marketing of an unapproved medical device in violation of the Federal Food, Drug, and Cosmetic Act; the company then took corrective action by removing certain health-related reporting from its products.