By Gabriela A. Weigel*
“They are not ignorant men. Most of them are trained physicians and some of them are distinguished scientists. Yet these defendants, all of whom were fully able to comprehend the nature of their acts, and most of whom were exceptionally qualified to form a moral and professional judgment in this respect, are responsible for wholesale murder and unspeakably cruel tortures.
It is our deep obligation to all peoples of the world to show why and how these things happened. It is incumbent upon us to set forth with conspicuous clarity the ideas and motives which moved these defendants to treat their fellow men as less than beasts. The perverse thoughts and distorted concepts which brought about these savageries are not dead. They cannot be killed by force of arms. They must not become a spreading cancer in the breast of humanity. They must be cut out and exposed for the reason so well stated by Mr. Justice Jackson in this courtroom a year ago–
‘The wrongs which we seek to condemn and punish have been so calculated, so malignant, and so devastating, that civilization cannot tolerate their being ignored because it cannot survive their being repeated.’”
Opening Statement in the Doctors Trial at Nuremberg by Brig. General Telford Taylor
(December 9, 1946)[1]
I. Introduction
It is almost unnecessary to say that with the advancement of science there has come a plethora of ethical dilemmas – dilemmas which lay bare questions about the boundaries of our human interaction. The drive for “progress” and knowledge for the “good of society,” as well as the age-old desire for profit and power, continually creates a conflict between further advancing the human race and respecting whatever meaning and value we ascribe to ourselves individually by virtue of our humanity. We only have to turn to the last century to find case after case – the horrors of the Nazi medical experiments, the shocking revelations by Henry Knowles Beecher of postwar abuses in the United States[2], the Tuskegee Syphilis Study (1932-1972)[3], the Jewish Chronic Disease Hospital study, the Japanese Army’s “Unit 731,” etc.[4] – of cringe-worthy experiments involving human subjects.
Each of these cases elicited a societal backlash, and together they prompted the creation of ethical codes which address and clarify the boundaries of research involving human subjects. In 1947, the Nuremberg Code gave an absolutist, natural-law-based condemnation of the Nazi experiments. The World Medical Association later issued the Declaration of Helsinki in 1964, and in 1979 the National Commission for the Protection of Human Subjects of Biomedical and Behavioral Research issued the Belmont Report. The “Common Rule” eventually evolved from the Belmont Report and currently governs the regulation of human subject research in the United States. Embedded in each of these ethical codes is the principle of autonomy, primarily safeguarded through the application of informed consent.
However, the application of these ethical codes has been, needless to say, less than easy. Given the inherent conflicts of interest involved in human subjects research (the pursuit of knowledge and profit at the risk of undermining human rights, and by extension, dignity) and the speedy evolution of the nature of clinical research,[5] violations and problems in application still occur.[6] One particular arena affected by these conflicts of interest is the university system, which since the 1950s has been the locus of federally funded research, particularly in the biomedical sciences.[7] Many of the most notable, and deadly, human subject research studies of the last 60 years have been located in university health and research centers and their affiliates.[8] Though initially charged with the academic education of their students, universities have, since the advent of the research university system, also been expected to produce cutting-edge technology as partners working in close relationship with industry or transferring their discoveries readily to it.[9] These expectations have magnified the conflicts of interest already inherent in human subject research and have raised additional questions about licensing, patents, and the meaning of a university’s academic mission.
This paper seeks to further explore the conflicts of interest inherent in human subject research, particularly in light of the university context. To begin, it relays the historical development of the Common Rule and its emphasis on informed consent in greater detail. The historical development of the American research university and its relationship to federal and industrial funding is also fleshed out. The current status of the Common Rule is then evaluated for possible deficiencies in regulating human subject research, particularly in the university setting, and examples of recent ethical violations help to illustrate these deficiencies. Finally, the recent Notice of Proposed Rulemaking issued by the Department of Health and Human Services is examined for the resolutions it proposes to the stated issues and for the resolutions that are still missing. This paper concludes that though the Notice of Proposed Rulemaking may close some gaps in regulatory coverage, the Common Rule and the Notice of Proposed Rulemaking still lag behind, not only in adapting to the changing nature of clinical research but, more importantly, in their underlying premises and focus.
II. History and Context
A. The development of human subjects research regulation
The 20th century ushered in a new type of government regulation, focusing on the ethical use of human subjects in research. The Food and Drug Administration’s (FDA) modern regulatory functions began with the passage of the Pure Food and Drug Act in 1906 (by 1991, the FDA would become a key player in the development and adoption of the Common Rule for regulating its own clinical drug trials and research[10]). Forty years later, an American military tribunal opened criminal proceedings against German physicians and administrators for their participation in war crimes and crimes against humanity. The atrocities of the Nazi experiments, conducted on unwilling human subjects, rocked the world, and the military tribunal responded with the well-known Nuremberg Code, outlining what the tribunal saw as ten basic principles necessary for ethical human subjects research: voluntary consent, fruitful results for the good of society, a basis in animal experimentation, avoidance of unnecessary physical and mental suffering and injury, avoidance of experiments where there is a priori reason to believe death or disabling injury will occur, a reasonable degree of risk, proper preparation and adequate facilities, limitation to scientifically qualified personnel, the human subject’s liberty to rescind consent and end the experiment, and the duty of the scientist to terminate an experiment that becomes excessively dangerous.[11]
Despite the American issuance of the Nuremberg Code in 1947, the U.S. Public Health Service conducted the infamous Tuskegee Syphilis Study from 1932 to 1972, until publicity surrounding the experiment forced the U.S. Department of Health, Education, and Welfare to end it.[12] Within that same timeframe, the Office of Scientific Research and Development (created by President Roosevelt in 1941) conducted dangerous dysentery vaccine experiments on mentally disabled children through the Committee on Medical Research in 1943-44.[13] In addition, multiple other U.S. agencies conducted highly hazardous plutonium experiments on unwitting human subjects.[14] Henry Knowles Beecher, in his 1966 article in the New England Journal of Medicine, further showed that much had not changed with regard to research abuses after the war.[15] It was clear that, as the historian David Rothman concluded, “Well into the 1960’s, the American research community considered the Nuremberg findings, and the Nuremberg Code, irrelevant to its own work.”[16]
At the same time, the World Medical Association reacted to growing awareness of the continuing problem of research abuse by developing the Helsinki Codes, I and II, in 1964 and 1975, respectively. These documents reiterated some of the basic tenets laid out at Nuremberg, but called for more specifics. They proposed that consent be preserved in writing, distinguished clinical research for patient care from clinical research for non-therapeutic purposes, and required that an ethical review committee monitor all research with human subjects.[17] The National Research Act of 1974 also stepped in just before the 1975 revision of the Helsinki Code, responding directly to the publicity surrounding the Tuskegee Syphilis Study by creating the National Commission for the Protection of Human Subjects of Biomedical and Behavioral Research, which in turn produced the Belmont Report.[18] The Belmont Report would become “the foundational document for the ethics of human subject research in the United States.”[19]
The bedrock principle found in each of these ethical codes is a respect for autonomy, verified through the use of consent. The Nuremberg Code states that “The voluntary consent of the human subject is absolutely essential,”[20] and although it did not carry the force of law, the Nuremberg Code was the first international document to advocate voluntary participation and informed consent in human subject research.[21] The World Medical Association followed with the Declaration of Helsinki in 1964,[22] outlining recommendations for medical doctors involved in human subjects “research combined with clinical care” and “non-therapeutic [human subjects] research,” and reiterating the necessity of informed, voluntary consent.[23] The National Commission for the Protection of Human Subjects of Biomedical and Behavioral Research, in issuing the Belmont Report in 1979, outlined three basic ethical principles that should undergird all human subject research: respect for persons (autonomy), beneficence, and justice.[24] Corresponding to these principles, respectively, were the applications of informed consent, assessment of the risks and benefits involved in the research, and proper selection of research subjects.[25] A series of regulations following the issuance of the Belmont Report was eventually formally adopted by 17 agencies and the FDA in 1991 as the Federal Policy for the Protection of Human Subjects, or the “Common Rule.”[26] The Common Rule comprises the current federal regulations for human subject research, requiring entities receiving federal funding to establish Institutional Review Boards to monitor and approve proper and ethical research procedures.[27]
Since its inception in 1991, the Common Rule has been subject to little change. It was not until July 2011, after President Obama had issued an Executive Order requiring federal agencies to review and revise significant regulations that had become burdensome or ineffective,[28] that the Department of Health and Human Services (HHS) issued an Advance Notice of Proposed Rulemaking (ANPRM).[29] Researchers welcomed the ANPRM, titled “Human Subjects Research Protections: Enhancing Protections for Research Subjects and Reducing Burden, Delay, and Ambiguity for Investigators,” as a much-needed response to the growth and change in the nature of clinical research since the Common Rule’s inception.[30] After much anticipation, the ANPRM finally culminated in the Notice of Proposed Rulemaking, issued by HHS on September 8, 2015.
B. The history of the American research university
The concept of the research university is barely older than the development of human research ethics and regulation. Beginning in the nineteenth century and notably guided by the ideas of Wilhelm von Humboldt, “Research, as an experimental procedure conducted in a spirit of discovery, first appeared in German universities.”[31] By the end of the nineteenth century, the Humboldtian model of the research university faced multiple hurdles, including how to properly integrate research and teaching under the same roof, leading many universities to delegate experimental research activities to in-house institutes of basic research under the aegis of individual professors.[32] By the beginning of the twentieth century, however, North American universities once again took up the ideal of combining and integrating teaching and research, with several institutions successfully asserting themselves as distinguished research universities.[33] One of the most notable of these institutions is Johns Hopkins University, which prides itself on being “America’s first research university.”
The experimental approach now integrated into the university system played a major role in the development of new scientific knowledge and technologies. Recognizing the value of the new research university model, and prompted primarily by post-World War II concerns, the federal government established a comprehensive policy on its role in supporting research.[34] This policy was initiated in Vannevar Bush’s Science, The Endless Frontier in 1945, which set an overarching objective of cultivating a “steady stream of scientific knowledge to ensure economic growth.”[35] By 1950, Congress had established the National Science Foundation (NSF) as an agency devoted to the support of basic research and education in all scientific and engineering disciplines. The 1957 launch of Sputnik and the beginning of the space race further propelled U.S. investment in its research universities. In the decades that followed, federal research policies expanded beyond defense-specific research and funding for university-based research continued to increase.[36] Expansion included health-related research and basic research in a wide range of other disciplines, with total funding jumping from $31 million in 1940 to $3 billion in fiscal year 1979.[37] However, federal funding began to stagnate around the 1970s, and industry’s struggles with increasing competition from abroad and slowing productivity at home fostered interest in the possible benefits of closer cooperation with universities.[38] The federal government encouraged such cooperation,[39] notably through the Bayh-Dole Act (the Patent and Trademark Law Amendments Act of 1980), which invigorated technology transfer from universities to business and industry by allowing universities to retain patent rights in the results of federally funded research.[40] The Bayh-Dole legislation had a “significant impact,” creating a “compelling incentive for universities and industry to partner in the commercialization of scientific discoveries” and causing the number of patents awarded to university faculty to increase fourfold from 1988 to 2003.[41] Though the federal government remained the dominant source of funds for university research, the amount provided by industry rose through the 1970s and into the 1980s, concentrated especially in the fields of biology, chemistry, and engineering.[42] In recent years, funding has diminished, with funding through the National Institutes of Health (NIH) and the National Science Foundation (NSF) cut by 5% for the 2013 fiscal year[43] and industry funding slowing in the wake of the recent economic crisis.[44]
III. Conflicts of Interest and the Limits of the Common Rule
A. The basic conflict
The diminishment of funding sources poses an age-old problem for researchers in the university setting: money. One criterion for being called a research university is the significant amount of basic research conducted there, with the measure of that effort being “the magnitude of research grants received by its professional research staff from funding agencies.”[45] It is common knowledge that drawing in an adequate amount of funding is a tenure requirement for many professors in the sciences. The pressure to develop and produce successfully is only intensified by the limitation of resources.
However, funding pressures simply exacerbate a conflict already present in the field of human subject research. Inherent in the relationship between researchers and their subjects is a tension between the goals of the researchers – be they knowledge and the good of society, profit, or merely job security – and the dignity and autonomy of the subject. The Common Rule attempts to regulate these tensions by outlining the primacy and necessity of obtaining and documenting informed consent, establishing requirements for Institutional Review Boards (IRBs) and their analyses, adding protections for specified “vulnerable” classes, and including requirements for assuring compliance by research institutions.[46] Crafting an adequate regulatory scheme is no easy feat, particularly in an area as large and diverse as human subject research, but it is an absolute necessity. Unfortunately, the laws currently in place fall short of the task.
B. Experiments gone awry
The ineffectiveness of IRBs and the current Common Rule system is most starkly felt when death is the result, particularly when death occurs at a highly regarded institution expected to be an exemplar of not only successful but also ethical research. The unfortunate reality is that every experiment involving human subjects carries some risk of death, but when the regulatory system designed to protect against unacceptably high levels of that risk fails to do so, the results can be tragic.
Notable recent examples include Johns Hopkins University and the University of Minnesota. In 2001, Johns Hopkins, which at the time received more federal research money than any other research university (a hefty $310 million), had almost all of its federally financed medical research suspended after a federal oversight agency investigated the death of a young volunteer in a research study at the university.[47] The agency cited a failure by the university’s IRB to take proper precautions to protect research subjects and required the university to structure a new program to ensure its IRB was properly educated on federal human subject research regulations.[48] The young volunteer died a month after inhaling hexamethonium, an unapproved drug being used to study the causes of asthma.[49]
In an even more disastrous case, the University of Minnesota became the subject of public scrutiny after the violent death of a young man taking part in a clinical study at the university became widely publicized. Dan Markingson was a psychotic patient enrolled in a university-led study of the drug Seroquel after having been involuntarily committed as a violent threat to himself and others in 2004. Despite his mother’s protests and a previous determination that Dan was incapable of consent, Dan was kept in the study for five months, until he committed a bloody suicide with a box cutter.[50] The case of Dan Markingson was particularly egregious not only for its informed consent concerns, but also for the relationship between the university researcher in charge of Dan’s case and AstraZeneca, the drug company seeking to promote Seroquel. Only after several years of intense prompting by Markingson’s mother and others did the case come to light, having reportedly been hushed up and hidden from the public by university officials.[51] Such an instance potently demonstrates the conflict of interest that can arise where both the researcher and an industry sponsor stand to make a profit.
In less death-ridden cases, the courts have attempted to address the issue of informed consent, though the amount of case law in this area is very limited.[52] (Dan Markingson’s case was dismissed on the grounds of immunity.[53]) In the 2001 case Grimes v. Kennedy Krieger Institute, the court addressed the issue of children being enrolled in clinical trials after parents filed suit for negligence in a study conducted by a “prestigious research institute, associated with Johns Hopkins University.”[54] The study was a non-therapeutic research program investigating the effect of different lead abatement procedures for apartments containing lead dust and paint, which required children to be exposed to varying levels of lead dust. After the study had been ongoing for some time, several of the children were found to have accumulated hazardously high levels of lead in their blood, which the researchers knew but failed to report to the parents. The Maryland court concluded that parents or other surrogates could not consent to the participation of a child in nontherapeutic research where there is any risk of injury or damage to the health of the subject.[55]
In a 2014 class action against members of the University of Alabama Institutional Review Board, children who had been enrolled in a clinical trial involving research on premature infants with extremely low birth weights sued for injuries suffered as a result of the study. The clinical trial had two aspects: 1) “exploring treatment with continuous positive airway pressure”; and 2) “determining the appropriate levels of oxygen saturation in extremely low-birth-weight infants by comparing a lower versus higher range of levels of oxygen saturation in such infants.”[56] The plaintiffs’ parents contended that they “would not have enrolled the Plaintiffs [in the study] had they been informed of the true risks, benefits, and nature of the [Trial] [sic].”[57] The court granted the defendants’ motion to dismiss, again highlighting the lack of solid case law dealing with Common Rule violations and also emphasizing problems with the accountability of IRBs, which are often shielded on the grounds of immunity.[58] Because it can be difficult to hold IRBs accountable for failing to adequately safeguard human subjects, a stronger regulatory system is needed upfront.
Some research studies involve bodily harm, but others can be less physically threatening while still bringing the conflict of interest at hand into sharp relief. A more recent case involved the University of Maryland and chocolate milk. The University collaborated with industry partners through the Maryland Industrial Partnerships Program, a collaboration whose goal is to foster job creation. The milk manufacturer whose chocolate milk was being tested funded approximately ten percent of the study, which reached the favorable conclusion that the manufacturer’s brand of chocolate milk contributed to improved recovery from concussion-related injuries. Several critics have called the University’s study “shoddy” and unscientific, as it was released in 2015 without having been published or peer reviewed first. Others point to it as yet another example of the “commercialization” of university research. As one review in BioMed Central put it, “The growing emphasis on commercialization of university research may be exerting unfound pressure on researchers and misrepresenting scientific research realities, prospects and outcomes.”[59]
C. Important tenets of the Common Rule and their limitations
Designed to protect against disastrous cases, the two most impactful tenets of the Common Rule are the establishment of the IRB and the use of informed consent. In application, however, the Common Rule seems to have collapsed into a system in which IRBs focus primarily on consent forms and significantly less on the other elements the Rule highlights.
The Common Rule specifies that IRBs focus on seven elements: risk minimization, risk/benefit comparison, equitable subject selection, informed consent, data monitoring to ensure safety, privacy protection and confidentiality, and protection of vulnerable subjects.[60] However, one study of 20 IRB meetings across 10 leading academic medical centers found that, on the low end, only 40% of the IRBs discussed equitable subject selection, whereas on the high end, 98% of the IRBs discussed informed consent extensively.[61] The second most discussed element after informed consent was protection of vulnerable populations, addressed by 87% of the IRBs, with the remaining Common Rule criteria falling between the 40% and 87% discussion rates. The result is a lack of uniformity across IRBs in applying essential elements of human subjects protection.[62]
In addition to the problem of inconsistent IRB coverage of the basic protections required by the Common Rule, the adequacy of informed consent itself has been called into question. Some argue that treating informed consent as “anything goes so long as there is consent” is a flawed basis for protection,[63] and others discuss the difficulty of establishing exactly what informed consent means. While the Common Rule calls for the inclusion of pertinent information,[64] researchers regularly bury important aspects of the research experiment within lengthy and convoluted documents that are difficult for the lay person, who is in many cases the research subject, to understand.[65] Questions also remain about whether certain classes of persons, including children, the mentally disabled, and those particularly vulnerable to coercion, such as prisoners, can consent at all, and about whether it is ethical for others to consent on behalf of such vulnerable persons.
Essentially, the current limitations of the Common Rule stem from the internal nature of IRBs, their lack of uniformity due to limited resources and education, and the inadequacy of informed consent. While human subject research would ideally involve educated and aware subjects given full access to all the information necessary to make a legitimately consenting decision, the reality is that informed consent documents are often dense and hide necessary information. The problem is further exacerbated by the fact that IRBs focus far too much on informed consent alone and hardly enough on the ethical quality of the research experiment itself and the vulnerability of the subjects taking part in the study. Furthermore, IRBs are often swamped with reviewing the multitude of research projects ongoing at their own institutions. Unfortunately, informed consent standing alone, especially as it is currently applied, can hardly safeguard persons from the often deadly risks associated with poorly regulated clinical trials.
IV. The Common Rule Going Forward
A. The Notice of Proposed Rulemaking
In July 2011, the U.S. Department of Health and Human Services (HHS) took a giant step towards the first general overhaul of the Common Rule since it was issued in 1991 by publishing the Advance Notice of Proposed Rulemaking (ANPRM).[66] The ANPRM was much awaited by those who believed modernization of the Common Rule was desperately needed, but it did not advance until four years later, on September 8, 2015, when the Notice of Proposed Rulemaking (NPRM) made its debut. The Office for Human Research Protections stated that the focus of the NPRM was to elaborate particularly on two of the three key concepts upheld in the Belmont Report, autonomy and beneficence,[67] in addition to the third, justice.[68]
To achieve this, eight major proposals are included in the NPRM, some new and some adapted from the ANPRM, which aim to streamline the entire process and give greater protection and control to participating human subjects. The NPRM “sets forth proposals to modify informed consent for biospecimen research, improve the understandability of consent forms, mandate single institutional review board (IRB) oversight of research, and establish data security safeguards.”[69] In addition, the NPRM seeks to extend the scope of the policy to cover all clinical trials conducted at an institution that receives federal funding for human subjects research, regardless of each trial’s particular funding source.[70] Finally, the NPRM adds exemption and exclusion categories, as well as categories for which continued IRB review throughout the life of the research experiment can be eliminated.[71]
Four of these proposals particularly impact the research university. The most significant (logistically speaking) proposed change is to almost always require informed consent for the secondary use of biospecimens,[72] regardless of their identifiability.[73] Essentially, the proposal would expand the definition of a human subject to include biospecimens.[74] Previously, the definition of a human subject for purposes of the Common Rule included only a living person about whom a researcher obtains personal data or private information that can be connected to the person;[75] de-identified biospecimens were not included. Two alternative proposals call for expanding the definition either to include whole human genome sequencing or to include only certain biospecimens used in particular technologies.[76] Under the NPRM, consent will almost always be needed to conduct research even with de-identified biospecimens. To help cover the magnitude of specimens this change would encompass, the NPRM allows for “broad consent” to be given for the unspecified future use of biospecimens in research, as opposed to specific consent for a specific study. Additionally, the IRB’s ability to waive the consent requirement for the use of biospecimens is further limited under the NPRM.[77] These changes are designed to further the Belmont goal of autonomy, giving persons control over the use of their biospecimens.[78]
The second major proposal in the NPRM is to simplify the informed consent document. Also in keeping with the goal of enhancing autonomy, the NPRM calls for information to be presented in sufficient detail, and organized and presented in a way that facilitates prospective subjects’ understanding of the reasons why one might or might not want to participate.[79] One key new feature is the addition of a reasonable person standard, designed to be akin to the legal understanding of the reasonable person.[80] The NPRM emphasizes that the essential information a reasonable person would want to know should be given to prospective participants upfront in consent documents, before any supplemental information is provided.[81] Additionally, supplemental information is encouraged to be organized into an appendix to further organize and clarify the informed consent document. Finally, to achieve greater transparency, the NPRM calls for a one-time requirement that consent forms for clinical trials be posted on a designated government website sixty days after the recruitment process for the study closes, making the forms accessible to the public eye.[82]
Of particular interest to university researchers, a third major proposal seeks to bring more clinical research under the Common Rule regulations. It requires U.S. institutions that receive any sort of federal funding for non-excluded, non-exempt human subject research to subject all of their clinical trials to the Common Rule, regardless of any other funding sources. The only exception would be clinical trials already subject to FDA regulation.[83] This proposal thus impacts not only the universities conducting the research, but also the sponsors who partner with university researchers to conduct clinical trials.
Finally, in a change advantageous to researchers collaborating across universities, the NPRM calls for only one IRB to review multi-site research conducted at U.S. institutions. Previously, the IRB at each location where the research was conducted had to independently review the research project. Exceptions to this proposal are made where more than one IRB is required by law or where a federal department or agency determines more than one IRB is needed. As part of this proposal, independent IRBs will be held directly accountable for compliance with the Common Rule.[84]
B. Analyzing the changes in light of university conflicts
The NPRM is over 130 pages of dense and sometimes vague proposals – covering about eighty-eight different issues within eight major changes – making it difficult even for experts to digest the changes and the impacts they will have.[85] From what can be discerned, four of the eight proposed changes included in the NPRM relate specifically to the issues highlighted in this paper: the proposal to require consent for the secondary use of biospecimens, the proposed reorganization of the informed consent document, the proposal to require only one IRB to review research taking place at multiple U.S. locations, and the proposed expansion of the Common Rule’s coverage regardless of funding source.
The proposal to require consent for the use of biospecimens poses a massive logistical problem for researchers everywhere, including at universities. The vast majority of biospecimens are collected during clinical service rather than specifically for research. Broad consent upfront helps to blanket all of the biospecimens moving from collection in clinical service to the research arena, but in practice it will make the process more expensive, requiring researchers to track which type of consent is linked to which biospecimen.[86] This tracking is key particularly where consent waivers are involved. IRBs have the authority to waive the consent requirements where there are compelling scientific reasons for the use of the biospecimens and the research cannot be conducted with other biospecimens for which consent can be or was obtained. However, IRBs will not be permitted to waive the consent requirement if the individuals providing the biospecimens were asked to give broad consent and declined.[87] Additional costs such as those these tracking requirements impose can only add to the funding burdens of university researchers.
Next, the proposals look to the consent document in an attempt to give human subject participants greater awareness of and control over their involvement in research. The NPRM adds the reasonable person standard, states that information important to the reasonable person should be placed upfront and in a clearly organized manner in the consent document, encourages that other supplemental information be organized neatly in an appendix, and requires that the consent document be posted on a government website once the recruitment period closes. However, these changes only minimally improve the problem of informed consent, and they create new problems. Nowhere in the NPRM is the definition of the reasonable person expounded upon,[88] causing confusion about what information and even what reading levels satisfy that standard, particularly among different demographics of human subject populations.[89] The Common Rule and the NPRM provide mandatory elements that must be included in the consent form,[90] but some argue not all of these elements are pertinent to different subject populations.[91] Second, no templates or specific guidelines are given for what constitutes “clearly organized,” other than a general suggestion to include important elements upfront and supplemental information in an appendix. Finally, the posting requirement is expected to be burdensome without providing any clear benefit. It is a one-time posting requirement taking place after the recruitment process concludes, meaning it neither benefits those interested in being recruited to participate in the study nor provides up-to-date information once the study commences, as the study may be changed or updated as it progresses.[92]
Moreover, some experts complain that the protections of informed consent should focus not on the document but on the process.[93] While some would argue that the move to streamline the document is an effort to reduce paternalistic attitudes in how researchers deal with participants, others would say the move actually reinforces those paternalistic notions, because ultimately the IRB will still be the only entity to fully review consent documents.[94] Dr. Ross McKinney, Director of Bioethics at the Trent Center, said in response to the NPRM consent form proposals that “what we were urged to do in Belmont was respect for persons, and autonomy is one element of that respect, but when you present information to people that they cannot understand in a format that they will not bother to go through, you are not respecting persons. The NPRM as best I can see it does nothing to further our respect for persons.” In addition, this development still misses a major factor relating to informed consent: the effectiveness of IRBs in reviewing the study itself for inherent ethical dilemmas.[95] If IRBs do not effectively review the consent process in the first place, even the supposed protection of a paternalistic method falls flat.
Furthermore, the NPRM does not discuss who has the final say with regard to the consent documents. This is a major problem for the legitimacy of the consent process because sponsors can and do use this gap to push for the inclusion and exclusion of certain information and formats.[96] Particularly where an IRB is attached to a certain organization rather than functioning independently, it is often placed under enormous pressure to compromise lest the industry sponsor bow out and the researchers be left with inadequate resources and funding.[97] Currently the NPRM and Common Rule do nothing to hold sponsors accountable in the consent process, and they include no requirement that sponsors even be disclosed.[98] This relates to the NPRM’s sub-proposal to hold independent IRBs directly accountable for compliance with the Common Rule. The overall proposal to streamline the process for multi-site research by allowing only one IRB to review is a welcome change because it reduces the potential for conflicting IRB opinions about a single research project. Unfortunately, however, the sub-proposal holding IRBs directly accountable moves the Common Rule further from addressing the real culprit: the sponsors.
Despite this gap, the last of these four proposals, namely the proposal to expand the Common Rule’s impact by subjecting all research at a federally funded institution to the federal regulation, does help to address previously untouched problems. The inherent conflict in human subject research is success and profit versus individual human dignity and safety. Where human subject research is conducted in a competitive market system, the drive for success and profit can easily begin to overcome the rights of the individual human person involved as a human subject. Given that many universities and other research institutions receive at least some portion of their funding from the federal government, this proposal would very quickly expand the scope of the Common Rule to almost all areas of human subject research conducted in the U.S.
C. Going Forward
The period for notice and comment regarding the NPRM closed, after an extension, on January 1, 2016.[99] A total of 2,189 comments were filed concerning the NPRM, up from the roughly 1,000 comments received concerning the ANPRM.[100] From this point, the comments will be addressed by the Department of Health and Human Services before the release of a final rule, which will officially update the Common Rule for the first time since 1991. The final rule is expected to be released sometime in 2016.[101]
V. The Unresolved Conflicts of Interest
The unfortunate reality is that the NPRM as it stands is a hugely missed opportunity. Technology has been progressing at unprecedented rates, but the Common Rule needs to address more than just the security and privacy issues that have evolved with technology.[102] As one commenter put it, the “Henrietta Lacks concern over commercialization” is being overblown,[103] and the real issues at hand are not being addressed.
Many of the conflicts of interest embedded in human subject research surround the idea and implementation of informed consent. Informed consent has positioned itself as the hallmark principle of each of the major ethical codes: Nuremberg, Helsinki, Belmont, and now the Common Rule. As one critic[104] of informed consent noted, consent “is intimately connected to our ideas of ‘liberty’ (I may do what I choose to do, and may refuse to consent to actions in which I do not wish to be involved); ‘equality’ (we all get to consent); ‘autonomy’ (I and only I may make these choices and decisions); and ‘dignity’ (I may make these decisions because of who and what I am).”[105] He continues, “Perhaps because consent is so embedded in our moral thinking, we put it to at least two different tasks. First, consent is a basic fundamental prerequisite of our political and social institutions and of our dealings with one another. We have lost the premodern vision of the world as an organic whole, and so consent, rather than nature or design, structures the coming together, binding together, and living together of modern master-less men. This side of consent animates the political ‘consent theory’ and permeates the rhetoric of the American founding. It is a necessary first condition for the legitimacy of the institution or end-state that proceeds from the act of consenting.”[106] Without properly conducted and adequately informed consent, the validity of researchers’ actions is highly suspect, and society cries injustice.
Despite our recognition that informed consent is a defining principle of valid human subject research, we have yet to agree on what informed consent exactly is or what it should look like. Currently, the Common Rule places great emphasis on the consent document itself, much to the dismay of many well-meaning practitioners in the research field.[107] However, there is little to no empirical research on informed consent and on what constitutes an effective method of ensuring that participants actually comprehend the study and the ramifications of being part of it.[108] Lists of required elements, like the one contained in the Common Rule (and expanded by the NPRM), are only minimally effective, since the focus for IRBs and researchers can become meeting the enumerated requirements rather than improving understanding. What instead needs to be done is to treat informed consent as a process rather than a single document with a signature. The objective, after all, is “to provide adequate information in a dispassionate style so that a reasoned decision about participation can be made.”[109] Ultimately, the informed consent process should be about making sure the participant is fully aware, autonomous, and competent, so that he or she can give an authentic response about the desire to participate or not participate in a research study.
Sponsors or industry partners, however, can destroy the validity of informed consent in two ways: by cloaking or omitting certain information from a participant in order to obtain human subjects for their research, or by designing studies that are inherently wrong. Given that the Common Rule and NPRM are silent as to who has the final say over the consent documents, the former is still a very real issue. The latter, however, has been left untouched by the NPRM and only minimally addressed by the Common Rule or the researchers who operate under it. “The first and most important question is whether the experiment should be done at all.”[110] Richard W. Garnett writes that in every case of human subject experimentation there are three interests, not just the two of the researcher and the human subject. The third is our collective interest in preserving human dignity in our community. We as humans, though not necessarily the human subject of the moment, either suffer or benefit from the performance or non-performance of certain experiments.[111]
The Common Rule does seek to protect human subjects from studies that could be seen as inherently wrong or dangerous through the use of the IRB. As we have seen, however, IRBs have not been effective in implementing a holistic understanding of the Common Rule and instead tend to zero in on the informed consent documents.[112] The reality is that responsibility should extend beyond the IRB. Sponsors, who provide funding as well as often unwanted pressure to increase the chances of success for marketing purposes, must be held accountable, as must the researchers themselves. The remarkable lack of case law concerning human subject research, contrasted with the relatively high number of recent cases of death and injury from poorly run clinical trials, is a testament to the lack of accountability currently in the system.
If anything, the Common Rule should be updated to deal with the changing nature of research relationships, particularly on university campuses. The pressures of funding and achieving success only add to the tension that already exists between the researcher and the human subject, and it is utterly unacceptable that sponsors of human research projects can maintain such a level of power and authority over the consent process – a process which plays such a vital role in the Common Rule’s scheme for the protection of human subjects. Unfortunately, the Common Rule has thus far limited itself, first by waiting so long to update its regulatory scheme, and second, because the NPRM only adds to already cumbersome and often complicated regulations rather than providing guidance that harks back to its roots in the Belmont Report. The principles of justice, beneficence, and respect for persons (or autonomy) have found their place, for good or ill, as the bedrock principles of the United States’ ethical understanding for dealing with human persons, and if they are to remain as such, they must be couched in a regulatory system that enhances their understanding and effect rather than diminishing it.
VI. Conclusion
The arena of human subject research is nothing short of complicated, both in the rapidly changing nature of the research conducted and in the application of ethical principles to such a fluid field. Isaac Asimov succinctly described this problem when he wrote, “The saddest aspect of life right now is that science gathers knowledge faster than society gathers wisdom.”[113] History has shown us the high stakes that can come with advancing the cause of science through human subject research, from atrocities in Nazi war camps to seemingly preventable deaths in modern clinical trials at prestigious universities like Johns Hopkins. The Common Rule and the most recent NPRM work to prevent some of the negative ramifications of poorly conducted human subject research by outlining new procedures to guide the actions of researchers. However, the NPRM is already long overdue and, as of yet, is still missing key components of adequate protection for human subjects in research. A strong, core understanding of the meaning of the informed consent process and a robust structure to deal with the tensions between funding sources and the goodwill of researchers are desperately needed. Without these core elements, the conflict inherent in human subject research is hardly dealt with, and abuses are sure to continue with more frequency than is acceptable.
The research university is uniquely positioned concerning these issues, not only because it has become the host of a large quantity of modern research, including human subject research, but also because it has always been a place for advancing human wisdom and not just human knowledge. The university campus is where we frequently challenge our notions of human existence, of human trial and suffering, of human joy and prosperity. Where we see the university advancing towards the goals of consumerism and the marketplace rather than the higher goods of man’s dignity and identity, we must build a firm wall. The advancement of human knowledge should not come at the price of human life or human dignity, and universities are distinctively suited to guard those goods as they have always done.
*J.D. Candidate, Notre Dame Law School, 2017
[1] Brig. General Telford Taylor, Opening Statement in the Doctors Trial (1946), http://law2.umkc.edu/faculty/projects/ftrials/nuremberg/doctoropen.html.
[2] Daniel Callahan, What Price Better Health? Hazards of the Research Imperative at 135 (Univ. of Cal. Press 2006).
[3] Mary Faith Marshall, Born in Scandal: The Evolution of Clinical Research Ethics, Sci., Apr. 26, 2002 at, http://goo.gl/rV8Qhv.
[4] Richard W. Garnett, Why Informed Consent? Human Experimentation and the Ethics of Autonomy, 36 Cath. Law. (1996) at 465
[5] Leslie Meltzer Henry, Revising the Common Rule: Prospects and Challenges, 41 J. of Law, Med. & Ethics 386, 386-389 (2013).
[6] Robert Steinbrook, Protecting Research Subjects – the Crisis at Johns Hopkins, 2002 New Eng. J. Med. 716-720 (2002); Carl Elliott, Why the University of Minnesota Psychiatric Research Scandal Must Be Investigated, MinnPost, Mar. 28, 2013 at, https://goo.gl/YxD3gI.
[7] Robert LaCroix & Louis Maheu, The Emergence of the Research University, in Leading Research Universities in a Competitive World 3-11 (McGill-Queen’s Univ. Press 2015).
[8] Supra note 6
[9] Michael M. Crow & Christopher Tucker, The American Research University System As America’s De Facto Technology Policy, 1999 (1999).
[10] U.S. Department of Health & Human Services, Federal Policy for the Protection of Human Subjects (‘Common Rule’), http://www.hhs.gov/ohrp/humansubjects/commonrule/
[11] The ten principles of the Nuremberg Code, infra, in full are as follows:
- The voluntary consent of the human subject is absolutely essential. This means that the person involved should have legal capacity to give consent; should be so situated as to be able to exercise free power of choice, without the intervention of any element of force, fraud, deceit, duress, overreaching, or other ulterior form of constraint or coercion; and should have sufficient knowledge and comprehension of the elements of the subject matter involved as to enable him to make an understanding and enlightened decision. This latter element requires that before the acceptance of an affirmative decision by the experimental subject there should be made known to him the nature, duration, and purpose of the experiment; the method and means by which it is to be conducted; all inconveniences and hazards reasonably to be expected; and the effects upon his health or person which may possibly come from his participation in the experiment. The duty and responsibility for ascertaining the quality of the consent rests upon each individual who initiates, directs, or engages in the experiment. It is a personal duty and responsibility which may not be delegated to another with impunity.
- The experiment should be such as to yield fruitful results for the good of society, unprocurable by other methods or means of study, and not random and unnecessary in nature.
- The experiment should be so designed and based on the results of animal experimentation and a knowledge of the natural history of the diseases or other problem under study that the anticipated results will justify the performance of the experiment.
- The experiment should be so conducted as to avoid all unnecessary physical and mental suffering and injury.
- No experiment should be conducted where there is an a priori reason to believe that death or disabling injury will occur; except, perhaps, in those experiments where the experimental physicians also serve as subjects.
- The degree of risk to be taken should never exceed that determined by the humanitarian importance of the problem to be solved by the experiment.
- Proper preparations should be made and adequate facilities provided to protect the experimental subject against even remote possibilities of injury, disability, or death.
- The experiment should be conducted only by scientifically qualified persons. The highest degree of skill and care should be required through all stages of the experiment of those who conduct or engage in the experiment.
- During the course of the experiment the human subject should be at liberty to bring the experiment to an end if he has reached the physical or mental state where continuation of the experiment seems to him to be impossible.
- During the course of the experiment the scientist in charge must be prepared to terminate the experiment at any stage, if he has probable cause to believe, in the exercise of good faith, superior skill, and careful judgement required of him, that a continuation of the experiment is likely to result in injury, disability, or death to the experimental subject.
Reprinted in Wendy K. Mariner, AIDS Research and the Nuremberg Code, in The Nazi Doctors and the Nuremberg Code: Human Rights in Human Experimentation 286-304 (George J. Annas & Michael A Grodin eds., 1992).
[12] Office of the Vice Chancellor, Research and Economic Development, 2015 (2015), http://ors.umkc.edu/research-compliance-%28iacuc-ibc-irb-rsc%29/institutional-review-board-%28irb%29/history-of-research-ethics.
[13] Supra note 2, at 139
[14] Id. 140
[15] Id. 142
[16] Supra note 2, at 140
[17] Id. 142
[18] Supra note 12
[19] Id.
[20] The Nuremberg Code, in 2 Trials of War Criminals Before the Nuremberg Military Tribunals Under Control Council Law No. 10 at 181-182 (U.S. Gov’t Printing Office 1949), http://www.hhs.gov/ohrp/archive/nurcode.html.
[21] Supra note 12
[22] WMA Gen. Assembly, WMA Declaration of Helsinki – Ethical Principles for Medical Research Involving Human Subjects (1964), http://www.wma.net/en/30publications/10policies/b3/.
[23] Id.
[24] The Nat’l Comm’n for the Prot. of Human Subjects of Biomedical & Behavioral Research & U.S. Dep’t of Health & Human Services, Office of the Sec’y, The Belmont Report: Ethical Principles and Guidelines for the Protection of Human Subjects of Research (1979), http://www.hhs.gov/ohrp/humansubjects/guidance/belmont.html.
[25] Id.
[26] Called the “Common Rule” because it is common to 18 departments and agencies, namely the Department of Commerce (15 CFR 27), the Department of Defense (32 CFR 219), the Department of Education (34 CFR 97), the Department of Energy (50 CFR 745), the Department of Health and Human Services (45 CFR 46), the Central Intelligence Agency (45 CFR 46), the Department of Homeland Security (45 CFR 46), the Social Security Administration (45 CFR 46), the Food and Drug Administration (45 CFR 46), the Department of Housing and Urban Development (24 CFR 60), the Department of Justice (28 CFR 60), the Department of Transportation (49 CFR 11), the Department of Veterans Affairs (38 CFR 16), the Consumer Product Safety Commission (16 CFR 1028), the Environmental Protection Agency (40 CFR 26), the Agency for International Development (22 CFR 225), the National Aeronautics and Space Administration (14 CFR 1230), the National Science Foundation (45 CFR 690), and the Department of Agriculture (7 CFR 1c). The Department of Labor will join for the first time under the NPRM.
[27] “The Common Rule,” 45 C.F.R. § 46 (2009).
[28] Supra note 5
[29] Id.; Summary, Doug Lederman, Updating the Common Rule, 2011 Inside Higher Ed (2011) at, https://goo.gl/EvUYpU.
[30] Supra note 5
[31] Supra note 7, at 1
[32] Id. at 4
[33] Supra note 7, at 5
[34] Arden L. Bement Jr. & Angela Phillips Diaz, U.S. Public Research Universities: A Historical Perspective, Purdue Univ. (2011), at 1.
[35] Id. at 2.
[36] Id. at 4; Judith Jarvis Thomson et al., Academic Freedom and Tenure: Corporate Funding of Academic Research, 69 Academe 18a (1983) available at http://www.jstor.org/stable/40249083.
[37] Academic Freedom and Tenure, supra note 36.
[38] Supra note 37.
[39] Supra note 37.
[40] Supra note 34, at 4
[41] Id. at 5; Jerry G. Thursby & Marie C. Thursby, Faculty Participation in Licensing: Implications for Research, 40 Res. Pol’y 20, 20-29 (2011).
[42] Supra note 37.
[43] Michael Price, Funding Cuts Ravage Academic Laboratories, 2013 Sci. Careers: The Job Market (2013), http://www.sciencemag.org/careers/2013/09/funding-cuts-ravage-academic-laboratories.
[44] Yudhijit Bhattacharjee, Industry Shrinks Academic Support, 312 Sci. (2006), http://www.sciencemag.org.
[45] Supra note 7, at 8
[46] Supra note 27; Supra note 12; Carl H. Coleman, Vulnerability as a Regulatory Category in Human Subject Research, 2009 J. of Law, Med. & Ethics (2009).
[47] G. Kolata, Johns Hopkins Death Brings Halt to U.S.-financed Human Studies, 2001 N.Y. Times A1 (2001).
[48] Id.
[49] Id.
[50] Carl Elliott, “I Was Just Following Orders”: A Seroquel Suicide, a Study Coordinator, and a “Corrective Action”, 2012 Mad in America Science, Psychiatry & Community (2012) at, https://goo.gl/QdWVCU; Carl Elliott, University of Minnesota Blasted for Deadly Clinical Trial, Mother Jones, Apr. 3, 2015 at, http://goo.gl/9aobJb.
[51] Carl Elliot, Why the University of Minnesota Psychiatric Research Scandal Must Be Investigated, MinnPost, Mar. 28, 2013 at, https://goo.gl/YxD3gI.
[52] Grimes v. Kennedy Krieger Inst., Inc., 782 A.2d 807 (Md. 2001); see footnote 1 of the opinion.
[53] Supra note 50.
[54] Grimes v. Kennedy Krieger, supra note 52.
[55] Grimes v. Kennedy Krieger, supra note 52.
[56] Looney v. Moore (2014) (slip copy).
[57] Id.
[58] Id.
[59] Julia Belluz, The Incredible Tale of Irresponsible Chocolate Milk Research at the University of Maryland, Vox, Jan. 16, 2016, http://www.vox.com/2016/1/16/10777050/university-of-maryland-chocolate-milk (citing a 2015 review in BioMed Central, http://bmcmedethics.biomedcentral.com/articles/10.1186/s12910-015-0064-2).
[60] Supra note 27; Charles W. Lidz et al., How Closely Do Institutional Review Boards Follow the Common Rule?, 87 Acad. Med. (May 22, 2012).
[61] How Closely, supra note 60.
[62] Supra note 61.
[63] See generally Richard W. Garnett, Why Informed Consent? Human Experimentation and the Ethics of Autonomy, 36 Cath. Law. (1996), at 465.
[64] Supra note 37.
[65] Grimes v. Kennedy Krieger, supra note 52.
[66] U.S. Department of Health & Human Services, NPRM 2015 – Summary, http://www.hhs.gov/ohrp/humansubjects/regulations/nprm2015summary.html.
[67] Video: Office of Human Research Protections: Webinars on the Notice of Proposed Rulemaking (NPRM) (Sept. 30, 2015), https://www.youtube.com/watch?v=ykxT25ze-rg&list=PLrl7E8KABz1FtLMpK2zPa8nqV-F-xhW2C&index=1.
[68] Proposed Rules, Fed. Reg. (Sept. 8, 2015) (to be codified at 45 C.F.R.), at 11.
[69] Ropes & Gray, Alert: HHS Proposes Major Overhaul of the Common Rule (Sept. 8, 2015), https://www.ropesgray.com/newsroom/alerts/2015/September/HHS-Proposes-Major-Overhaul-of-the-Common-Rule.aspx.
[70] Sweeping Changes Proposed to Common Rule Governing Human Subjects Research, Quorum Blog (Sept. 4, 2015), http://www.quorumreview.com/2015/09/04/nprm-2015-summary-post/.
[71] Supra note 67.
[72] Id.; Video: Research Match, Enhancing and Clarifying Consent Forms and Establishing Standard Safeguards (Streamed Live Nov. 18, 2015), https://www.youtube.com/watch?v=FBTHKkkOP5E.
[73] Supra note 67.
[74] “Biospecimens are materials taken from the human body, such as tissue, blood, plasma, and urine that can be used for cancer diagnosis and analysis. When patients have a biopsy, surgery, or other procedure, often a small amount of the specimen removed can be stored and used for later research. Once these samples have been properly processed and stored they are known as human biospecimens.” National Cancer Institute, Patient Corner: What Are Biospecimens and Biorepositories?, Biorepositories & Biospecimens Res. Branch (July 28, 2014), http://biospecimens.cancer.gov/patientcorner/.
[75] Research Match, supra note 72.
[76] Supra note 68, at 15.
[77] Supra note 67.
[78] Id.
[79] Id.
[80] Id.
[81] Supra note 68, at 6.
[82] Id.
[83] Supra note 67.
[84] Supra note 68, at 51.
[85] Research Match, supra note 72 (citing Maureen Smith).
[86] Id. (citing Dr. Mark Schreiner).
[87] Supra note 67.
[88] Research Match, supra note 72; supra note 67 (only the latter video mentions that it is akin to the legal standard).
[89] Research Match, supra note 72 (citing Jeri Burtchell).
[90] The eight elements currently required in the Common Rule are as follows:
- A statement that the study involves research, an explanation of the purposes of the research and the expected duration of the subject’s participation, a description of the procedures to be followed, and identification of any procedures which are experimental;
- A description of any reasonably foreseeable risks or discomforts to the subject;
- A description of any benefits to the subject or to others which may reasonably be expected from the research;
- A disclosure of appropriate alternative procedures or courses of treatment, if any, that might be advantageous to the subject;
- A statement describing the extent, if any, to which confidentiality of records identifying the subject will be maintained;
- For research involving more than minimal risk, an explanation as to whether any compensation and an explanation as to whether any medical treatments are available if injury occurs and, if so, what they consist of, or where further information may be obtained;
- An explanation of whom to contact for answers to pertinent questions about the research and research subjects’ rights, and whom to contact in the event of a research-related injury to the subject; and
- A statement that participation is voluntary, refusal to participate will involve no penalty or loss of benefits to which the subject is otherwise entitled, and the subject may discontinue participation at any time without penalty or loss of benefits to which the subject is otherwise entitled.
Supra note 27.
Elements to be added or changed by the NPRM are as follows:
- The number of subjects to be included in the trial.
- Prospective subjects are to be informed that their biospecimens may be used for commercial profit and whether the subject will or will not share in this commercial profit.
- Prospective subjects are to be informed of whether clinically relevant research results, including individual research results, will be disclosed to subjects, and if so, under what conditions.
- Provide subjects or their legally authorized representatives with an option to consent, or refuse to consent, to investigators re-contacting the subject to seek additional information or biospecimens or to discuss participation in another research study.
Research Match, supra note 72 (citing Carson Reider and Dr. Ross McKinney).
[91] Research Match, supra note 72. (For example, a patient population with diabetes would not necessarily need additional information about diabetes treatment or simplified language concerning diabetes since they are already familiar with the disease).
[92] Id. (citing Amy Schwarzhoff).
[93] Id. (citing Dr. Ross McKinney and Dr. Mark Schreiner).
[94] Id. (citing Carson Reider, Dr. Ross McKinney, and Amy Schwarzhoff).
[95] Supra note 60.
[96] Research Match, supra note 72 (citing Carson Reider and Dr. Mark Schreiner, among others).
[97] Id. (citing Amy Schwarzhoff).
[98] Id. (citing Dr. Ross McKinney).
[99] U.S. Department of Health & Human Services, Federal Policy for the Protection of Human Subjects (2016) (docket folder), http://www.regulations.gov/#!docketDetail;D=HHS-OPHS-2015-0008.
[100] Supra note 67.
[101] Id.; Research Match, supra note 72.
[102] Supra note 68, at 25, 35, 45.
[103] Research Match, supra note 72 (citing Dr. Ross McKinney).
[104] Richard W. Garnett offers an interesting discussion of whether informed consent is sufficient as a regulatory tool and safeguarding principle in the area of human interaction. He writes, “We might block the consented-to action, but we pay lip service to consent’s justifying role by assuring ourselves that had the consent been untainted, had it been ‘informed,’ it would have had moral force. In fact, we pay lip service precisely because we often slightly suspect that consent cannot and does not always justify. Therefore, in difficult situations, we declare that the decision maker did not or could not really consent, that the consent was not ‘informed’ or ‘knowing’ or ‘voluntary.’ Rather than admit that the consent does not and could not justify the act, we denigrate the consent and, necessarily, the consenter as well.
“This is cheating; it is a subterfuge designed to hide our unease and to allow us to profess simultaneous commitment to values that often conflict.”
Supra note 4, at 460 (citing Robin West, Colloquy, Submission, Choice, and Ethics: A Rejoinder to Judge Posner, 99 Harv. L. Rev. 1449, 1449-50 (1986) (arguing that readers would not believe that people should be allowed to sell themselves into slavery or prostitution), and Guido Calabresi & Philip Bobbitt, Tragic Choices 195-199 (1978)).
[105] Id. at 457.
[106] Id. at 458.
[107] Research Match, supra note 72.
[108] Cognitive scientists have shown in numerous experiments that phrasing and the way information is presented can strongly affect the decision a human subject makes. A classic example is the consent form for becoming an organ donor, which was given to patients before surgery. By changing the wording from “Check here if you want to become an organ donor” to “Check here if you do NOT want to become an organ donor,” scientists obtained a much higher consent rate. However, there is little data on this with regard to informed consent in human subject research, an area arguably more complex than checking a box to become or not become an organ donor. The Common Rule provides no such data with regard to clinical trial informed consent documents, much less on whether an informed consent document is the most apt method for ensuring proper informed consent.
[109] Research Match, supra note 72 (citing Dr. Ross McKinney).
[110] Supra note 4, at 493.
[111] Id. at 498.
[112] How Closely, supra note 60.
[113] Isaac Asimov & Jason A. Shulman, Isaac Asimov’s Book of Science and Nature Questions 281 (Weidenfeld & Nicolson 1988).