The Ethics Police?: The Struggle to Make Human Research Safe

May 1, 2015


Highlights

When it comes to medical research using human beings, who decides what's right? How do U.S. institutional review boards work? What does "informed consent" mean when you need a law degree to understand the consent forms? How are clinical trials conducted overseas? Dr. Klitzman explores these troubling and complex ethical concerns.

Introduction

JOANNE MYERS: Good afternoon. I'm Joanne Myers, and on behalf of the Carnegie Council, I would like to welcome you to this Public Affairs Program.

As part of our series on global health, we are delighted to have back on this small stage a big star, Dr. Robert Klitzman. I know some of you were here for his earlier discussions in this series, the first of which focused on Ebola and providing health care to the global poor in times of crisis, and the other on medical tourism, both fascinating discussions. If you missed either one of them, I suggest you go to our website at www.carnegiecouncil.org, where you will find the full transcripts.

Today we are turning the tables. For the next 30 minutes or so, I will be interviewing Bob. This time the focus is on his recently published book, entitled The Ethics Police? The Struggle to Make Human Research Safe, in which he writes about medical research involving human participants. This is an issue that raises a host of ethical concerns pertaining to values such as dignity and bodily integrity, and legal issues such as privacy, informed consent, deception, exploitation of vulnerable populations, and withholding of information about risks and available treatments. All of these issues and more have concerned philosophers, lawyers, policymakers, scientists, and clinicians for many years.

Following our conversation, I will open the floor to a more interactive Q&A. At that time you can ask any questions that you feel Bob and I haven't talked about during our little discussion.

Discussion

JOANNE MYERS: We are going to begin, and as a way of beginning and laying the foundation, could you just tell us what medical ethics are?

ROBERT KLITZMAN: Medical ethics is the field that looks at the questions of what we should do with regard to health technology and biotechnology. We have many, many choices. We could keep people alive on machines when they are in a coma for decades. We could keep premature infants who are never going to leave the hospital alive in the hospital for years. We could spend a lot of money on certain treatments in research, for instance, that are going to affect very few people.

There are increasingly many questions. We have more and more technology. The question is when we should use it, on whom, and how. Should we spend a lot of money keeping people alive for decades when they are never going to leave the hospital and they are not aware of their surroundings? How should we regulate what pharmaceutical companies are doing? They have given us wonderful drugs that have improved many of our lives. On the other hand, they have been involved in scandals where they put people in experiments who end up dying, and people probably should not have been in the experiments.

Of course, we have many, many questions with genetics now. We can now sequence your whole genome for a few hundred dollars. That can potentially help tell us about your responses to different medications you may be on in the future or your likelihood of getting many diseases. But it can also tell you the likelihood that your kids may have autism in the future. It may tell you about small risks of disease that you have that may never come to fruition. We can tell you your risks of Alzheimer's disease, for instance.

There are many, many questions, of course, with that, and part of the problem is that information in your genome could be used against you if you apply for life insurance, disability insurance, long-term care insurance. So there are dilemmas there. Drug companies and most hospitals want to get as many people's genomes as possible in biobanks, which can help with science, but could potentially be used against you if the wrong people got it.

So these are all bioethical or medical ethical questions. Given all this technology, what do we do with it?

JOANNE MYERS: I think you have laid the foundation. Before we go into the meat of the discussion, in the introduction to your book, you talked a lot about why you particularly were interested in this topic. I wonder if you would share with us your personal story.

ROBERT KLITZMAN: Yes. When my father was sick, before he died, he was in the hospital. He got sick in June. My mother and sister, who actually are here tonight, went through this experience with me. He had leukemia, a cancer marked by rising numbers of white blood cells in the body. The doctor basically said, "There's no effective treatment. We could just let nature take its course, or we could do a little bit of an experiment and try this chemotherapy. There's a 50 percent chance that he might live for three to 18 months. What do you want to do?"

My father didn't know what to do. He was a realist. My mother said, "Let's let nature take its course." We all know the horrors of chemotherapy. I had recently finished my medical training. I was filled with hope for science and had seen doctors do great things and scientists do great things and believed in science. I thought we should go ahead and do the chemotherapy.

He did it, I think, in no small part because I urged him, and I think that affected him. Then, three months almost to the day, he died. The last three months were terrible. He was nauseous most of the time, etc.

So that haunted me and has continued to haunt me. Did I play a part in that? Did I push him to do this chemotherapy as an experiment? And I should say, when he died, the doctor came up to me and said, "Well, the experiment worked, but the patient died." His blood cell counts actually went back to normal, so the experiment worked. But the patient died.

I was horrified by that. I felt that we didn't—or at least I didn't—really understand the choices. Yet I do research. I routinely have people enter experiments. Yet I had never really seen how hard it is for someone being asked to be in an experiment, where you don't know what the outcome is, to do that.

I began to get interested in what the experience is like for patients who are asked to be in experiments. Are we minding the store sufficiently? We read about drug company scandals. We read about the scandal over Facebook doing research on us. There are questions with Ebola. So there are lots of questions about what experiments should be done on us. I began to realize that my way of thinking about it, or the way I had been trained in medicine, was different from the experience of patients and family members. So I thought it was worth trying to understand this.

The more I looked into it, the more I saw that no one really understood it. I can go into the whole story. We have set up research ethics committees, or institutional review boards (IRBs), in this country to decide what research is ethical enough, but there had been no study across different institutions of how they make decisions. And there have been problems: studies they approve sometimes later run into trouble.

There was an article about a week ago in The New York Times about a study at the University of Minnesota, for instance. Some of you may have seen it. I think it was in the business section, a week ago Saturday. Dan Markingson was a patient at the University of Minnesota with schizophrenia. He went to the hospital and was assigned a doctor. The doctor was doing an experiment. He told Dan that he should be in the experiment. Dan couldn't leave the hospital, so he thought he had to be in the experiment. AstraZeneca was paying the doctor about $18,000 for each patient entered into the study, and they were low on patients. And Dan killed himself.

It's a long story, but again showing how in some cases things are done that shouldn't be done.

JOANNE MYERS: Are there ethical standards that guide research today?

ROBERT KLITZMAN: Yes.

JOANNE MYERS: I know after the Tuskegee study—the African American population that had syphilis—they instituted these IRBs. Maybe you could talk a little about that.

ROBERT KLITZMAN: Research ethics is a fairly new field, because medical research didn't really exist in the past. When Hippocrates wrote the Hippocratic Oath, there were important parts of it that we still live by. Hippocrates said, "Whatever I hear in the house of a patient I will keep to myself. I will protect patient confidentiality." Those things we continue. But medical research as we now know it didn't exist then. There weren't clinical trials. So it is a fairly new phenomenon.

It first came to attention, and guidelines were first developed, because of the Nazis. During the Holocaust the Nazis conducted experiments on people in concentration camps. They wanted to find out, for instance, how long people could survive in the cold before they died, which the Nazis wanted to know because they had a lot of soldiers on the front. So they took people in concentration camps, put them in the cold, and saw how long it took for them to die. On the front they had soldiers who had lost limbs, and they wanted to see how they could sew the limbs back on. So they took people in concentration camps, severed their limbs, and tried to sew them back on. It didn't work. As a result of that, the Nuremberg tribunal was held and came up with a set of guidelines, the Nuremberg Code, for what needed to be done in terms of ethics for research. Most important, there should be full informed consent.

The problem is that we in this country didn't follow that. Until the early 1970s, as you said, there was the Tuskegee syphilis study. It started before World War II, when there was no effective treatment for syphilis. The U.S. government, supported by major institutions, decided to follow a group of African American men with syphilis, most of them semi-literate, to see what the natural course of the disease would be in their bodies, how it would slowly take over their brains, etc.

The problem is that, after World War II, penicillin became available as a definitive treatment. The doctors and the hospitals—I think Hopkins, and there were some other major institutions; I'm not sure which ones—thought that we should not tell the men that there was now an effective treatment for their syphilis, because it would destroy the experiment. If we told them there was effective treatment and gave it to them, that would be the end of the experiment and we wouldn't get the answer, so we shouldn't tell them. Again, a major moral problem.

But as a result of that, in 1974, Congress passed the National Research Act that said that we should set up local institutional review boards or research ethics committees to oversee what science does. They should be local, protect local community values. We now have 5,000 of these committees. They operate basically behind closed doors, so no one really knows how they make decisions.

When I started to interview researchers about them, I said, "What do you think of these committees?" They all said they have tremendous power. When I interviewed the committees, I said, "Do you think you have power?" They said, "We don't have any power. We're just following the regulations."

But they do have quite a bit of power, in ways that I could talk about.

As I said, the problem is, on the one hand, there are studies they approve that then run into problems, and there are studies that they block. One of the other problems is that when the National Research Act was passed in 1974, there was much less research than there is now. The budget at NIH [National Institutes of Health] has gone up many times over since 1974. It has just zoomed. We have not kept up. As science has advanced, we have not kept up in our understanding of the ethical, legal, and social and cultural issues involved.

In 1974, most research was done by a doctor in his clinic—usually "his." He would take the patients he had known for many years, he would say, "Here's a new treatment I want to try," and he would try it. Now, to get research done, to get FDA [Food and Drug Administration] approval on a new drug, you need to go to 60 hospitals to get 3,000 patients. So you have 50, 60 hospitals and you have 50, 60 different committees. One committee will say, "This project looks fine." One says, "No way." One says, "Change this." One says, "Change that."

So these committees have gotten in the way of doing a lot of science.

JOANNE MYERS: Do they have that much influence?

ROBERT KLITZMAN: They do, yes.

President Obama, in 2011, released proposals for changing the system of oversight and changing these committees. The back-story there is that Rahm Emanuel, who is now the mayor of Chicago, has a brother named Ezekiel Emanuel, who is an oncologist who works in bioethics. [Editor's note: For more on Ezekiel Emanuel check out his May 2014 Ethics Matter talk.] He then said to Obama, "Look, we need to change these committees." Those proposals came out in 2011. Nothing has really happened with them yet.

In December, the National Institutes of Health said, "We need to centralize these committees."

So there is a lot going on in terms of policy. But whether we centralize or not will address only some of the issues. The broader point, and the reason this is important, is that there are major questions we all face as a country. Should there be limits on what scientists do? If so, where should the limits be? Who should decide? If I say, "Here's a new drug, and I think it's going to help people," is it too risky? Is it not risky enough? How much informed consent is enough? If someone goes into a study in a hospital now, the informed consent is often 45 or 50 pages, filled with dense legalese, and people don't understand it.

More broadly, I would say that what I learned from doing this is that these committees are a prism through which to look at a whole range of issues that we face today that are important issues about science and ethics.

To give you an example, every week we all scroll down and click "I accept" online. Notice it doesn't say "Read and agree." It says "Scroll down." They don't even want you to read it. But those are legal documents, so we are all agreeing to things that we don't understand. Informed consent legalese has taken over our transactions. Just as informed consent is not understood by patients in studies, so too we all sign informed consents every time we go on Google or Facebook or Amazon. I think this is a major problem.

I think, policy-wise, there should be a law, since the Internet is basically a public utility at this point: to interact with government, whether you want to contact the IRS [Internal Revenue Service] or New York City government, you have to have Internet access. We should not be forced as consumers to agree to contracts that we don't understand.

JOANNE MYERS: What should be the role of the IRB?

ROBERT KLITZMAN: We need to rethink the system. The IRBs do a lot of great work, but we need to reform the system. We need to make sure that informed consent forms are understood by patients, for instance. There could be interactive online systems that say, "This study will look at your DNA. Click here if you don't understand DNA." We need to think creatively about that.

I have recently been interviewing research ethics committees in the developing world. They say sometimes it takes two days to get informed consent from someone, whereas here—in another study I did, some patients told me they were consented when they were on the gurney about to go into the operating room. They were told, "By the way, here's a study, blah, blah, blah. Just sign here." They didn't understand it. There are many recent examples of scandals in which people entered studies without understanding them.

JOANNE MYERS: You brought up developing countries. What are some of the challenges that you face? Drug companies often support it, and the NGOs want the money. Maybe you could elaborate a little bit.

ROBERT KLITZMAN: This is a big problem. Most biomedical research today is funded by entities in the United States. Most of it, though, is funded not by the National Institutes of Health or by government but by industry, by pharma, the pharmaceutical industry, and most of that research is done in the developing world. Most drug company studies now are not done in New York or New Jersey; they are done in India, China, Vietnam, Peru. Think about it. Just as our cell phones are not made here but in Asia, so too medical research has been moved offshore. It is much cheaper and there is much less regulation. If I am a drug company, to do a complicated study, it's cheaper to do it in Peru than in Peoria or New York.

But as a result, there are problems. You often have illiterate populations. For some people, their language was never a written language, for instance. In Xhosa, one of the major languages in South Africa, there is a word for "cure." That's the only word. There is no word for "treatment" or "experimental treatment." So if you say, "I'm going to give you this drug," the only word for "drug" is the word for "cure." If you say to someone, "I'm going to give you this drug. It's an experimental drug. We don't know if it's going to work," what they hear is, "You're giving me the cure."

One other scandal, for instance: Pfizer tested a drug for meningitis, a brain infection, on children in Nigeria, and a number of children died. The government of Nigeria decided to sue Pfizer, with the attorney general of Nigeria leading the effort. When the WikiLeaks cables came out a few years ago with Julian Assange, one of the things revealed was that Pfizer had decided to go after the attorney general and was getting the State Department involved to try to smear him: digging up dirt on him, making things up, putting it in the newspapers to embarrass him, because he was going after Pfizer over these children's deaths.

Again, these are major problems, and some of them are difficult ones. When I interviewed people in the developing world about the major ethical problems they face on these research ethics committees, the number-one problem, which surprised me, was money. People said, "You pay people $500 or $1,000 to be in a study in the United States. If you give people here $1,000, that's a year of salary, and you will be coercing people. They will be entering the study for the wrong reasons. But if we let you have our people cheap, we're letting you exploit our people. So what do we do?"

These are the kinds of problems that come up. I would say the answer is that there could be other things that could be done, gifts in kind. They could staff a clinic for a year. They could build a clinic, things like that. But again, these are difficult issues.

I was in one developing-world country a few years ago, where there was no road. You had to take a boat to get there. The people said, "Do you want to see our hospital?" I said, "You have a hospital?" They said, "Yes." So they bushwhacked through the jungle and they came to this building. It was a cinderblock building. They said, "A Western company built this as our hospital." They said they would build a hospital. They didn't say they would equip it or that they would staff it. They just built this cinderblock shell.

These are symptoms of globalization, in a way, in that you have wealthy companies coming to very poor areas. We all have the same biology basically. It's cheaper to do studies in the developing world, but that creates lots and lots of moral problems.

JOANNE MYERS: Do these NGOs in the developing world have any recourse, then, once they find out that there is very little money or that they are manipulating the study?

ROBERT KLITZMAN: These committees are—you hope that they are—

JOANNE MYERS: But these are only based here in the United States.

ROBERT KLITZMAN: These are, but the more recent interviews I have been doing are with these committees abroad. You hope they have the wherewithal, but they don't. Part of the problem is that there are conflicts of interest. These committees abroad will say to me, "Company X, a big drug company, is going to give us $3 million. That's a huge amount of money. We have one broken fax machine. We barely have Internet service here. Sometimes we have Internet service once a week on Tuesday for a few hours. That's the Internet service. Now they are going to give us $3 million to do a study. Of course we're going to say yes."

JOANNE MYERS: There are no international standards, then.

ROBERT KLITZMAN: There is no international law. The WHO [World Health Organization] and others have put out guidelines, but they are weak at best, and a lot of places don't follow them. Then there are other sets of problems. In a lot of countries you have a Ministry of Health that has to approve studies, and there ends up being corruption. They will say, "We need to approve it. That means we're going to get 20 percent of the money for our administrative needs." The people who are supposed to be overseeing it often have a vested interest in getting money themselves.

Emerging countries vary widely. This is not the case with every country, but a lot of countries face these kinds of problems.

JOANNE MYERS: Maybe you can talk a little bit about the Ebola vaccine that was available here, but not necessarily available to—

ROBERT KLITZMAN: Right. We all go along our merry way. We try not to get sick. We sometimes end up in the hospital. But then, of course, because we live in a smaller world, medical problems that come up elsewhere often become our problems, too, as was the case with Ebola; before that, of course, HIV, SARS, etc., etc.

Ebola, I think, was an interesting moment for a few reasons. One is that MSF (Médecins Sans Frontières, or Doctors Without Borders) was saying, starting a little over a year ago, that it was a problem. WHO, for the first few months, told MSF, "You're making a big thing out of nothing." It was not until August, five months later, that WHO said, "Yes, you're right, MSF. This is a big problem." [For more on MSF and the Ebola epidemic in West Africa, don't miss Robert Klitzman's discussion with former MSF executive Unni Karunakara in February 2015.]

We, however, didn't really pay much attention, until a patient showed up here with Ebola, if you remember. Then there was a big uproar.

There are a number of interesting points about it. One is that there were about five doses of a drug, ZMapp, which is made in tobacco leaves; that's another interesting story. No one had really heard of the drug until two people got it, both Americans. Of course, the people affected by Ebola were 99.99 percent African, but the handful of people who got the treatment were basically Westerners. Again, no transparency: it was not clear at first how or why they got it.

More recently, there are new questions. For instance, a clinical trial finished about two months ago showed that a drug called favipiravir reduces the risk of death from Ebola from, say, 40 percent to 10 or 20 percent. It reduces the risk quite a bit, but not to zero. The question now is, do we roll this out? Given that you can still die even if you are on the drug, there are further questions. If another company now says, "This is my new drug," should we compare the new drug against favipiravir? Should we roll favipiravir out to everyone? How do we decide?

Again, there are a lot of ethical issues that come up. If we have a vaccine, should we try it against a placebo? Should we not? At what point is something good enough to roll out in prime time?

Then, of course, in this country there were a lot of medical ethical issues. Various people who did not have Ebola, but had volunteered to go work with Ebola patients, came back here and were treated basically as if they were prisoners, which is unfortunate.

JOANNE MYERS: But in the end, how do you balance research and clinical care? Is there an easy way to do this, with or without oversight?

ROBERT KLITZMAN: The problem is that the way the system is set up now, it's assumed that these are easy questions. Some of these are difficult questions. Ethics is not always straightforward; there is not always an easy answer. You need people to discuss it, and we need to pay more attention to these issues. For some of them there is no consensus yet; we need to build consensus. You want to promote science, but you also want to protect people's rights: not do experiments on people without their knowledge, not have companies take advantage of them.

IRBs, for instance, when they were formed in the 1970s, didn't take into account whether or not the researcher was making money from a drug company off of the patients. That didn't change until 1999, when a patient named Jesse Gelsinger at the University of Pennsylvania, a healthy young man who had volunteered to be in a study, died in it. It was a gene transfer experiment. Jesse was the second patient, and the researcher knew there had been problems with the first patient, but he pushed Jesse into the study. It turns out that he owned part of the company making the drug, and so he wanted patients in his study. Even though there were problems, he went ahead and pushed patients in.

That was approved by the IRB, because the IRB at that point didn't really look at issues of conflicts of interest. It is an evolving system. The point is that there are ways that we could do this, but we need to pay attention to it. The problem is these committees operate behind closed doors. They generally refuse to be studied. They don't want people coming in and observing them, how they make decisions. That has been a problem.

So we need to know more about what the issues are. We need more transparency. I think as a society we need to be more aware of these issues. I think these are a prism for larger issues of how much we control what science does, what scientists do. We have been trained to think of science as a very pure enterprise, and for the most part, it is. But there are problems. We live on a planet with global warming. Some people think global warming/climate change doesn't exist.

Science is giving us more and more challenges. We are living in a golden age of science, in many ways. If you think how far we have come in the past 50 years with technology, with the Internet, with having cell phones, with biology, with understanding genetics, it's extraordinary. But we have not kept up with people understanding the science, people understanding the ethical issues, the legal issues.

JOANNE MYERS: Before we open it up, first of all, I want to thank you. You talked about closed doors. I want to thank you for opening the doors.

But I'm just curious. Why are there 4,000 institutional review boards? It's so many.

ROBERT KLITZMAN: Every institution has one, and major medical centers have several. We at Columbia have six of them, for instance.

JOANNE MYERS: That may be one way of changing the system—

ROBERT KLITZMAN: That's why Obama said let's centralize. There are pros and cons, I think.

Questions

QUESTION: My name is Jason Abrams.

I'm very curious about the issue of consent. You mentioned the issue of informed consent. Are there other issues involving consent that you didn't get to touch on in your remarks? Is there any experimentation going on today without some form of consent?

ROBERT KLITZMAN: Take the Facebook study. As some of you may know, over the summer two researchers at Cornell in Ithaca published a study they did with Facebook. Facebook took 700,000 users and did an experiment on them. To half of them it showed only the positive posts from their friends: "I got a new dog." "My sister just gave birth." To the other half it showed only negative posts: "My dog died." "My aunt died." Facebook found that people who got only positive posts were more likely to post positive things, and people who got only negative posts posted more negative things.

Everyone was a guinea pig, basically. They were able to affect people's moods. Of course, we know that people's moods on Facebook can have bad effects. There is cyber-bullying. Some teenage girls kill themselves because of stuff they have gotten on the Internet.

Facebook at first said, "Well, when you scroll down and click 'I accept,' it says we may do research on you." It turns out Facebook added that language only afterwards. The 700,000 people didn't know they were in a study. And I would argue that even a clause saying "we may do research on you" is insufficient. You don't know what kind of research they are doing. You are not given a separate choice; your choice, arguably, is to use Facebook or not. I would argue that's not enough.

We try to get informed consent, but the quality of the informed consent varies, both in clinical medicine and in research, all the way down to "Just sign this" on the gurney. Another example: last year there was a big scandal over a study of giving different levels of oxygen to premature infants, called the SUPPORT study. When infants are born prematurely, they need oxygen. We know that if we give them too little oxygen, they die, but if we give them a lot of oxygen, some of them end up going blind. Ninety-three percent of doctors give the high level and 7 percent give the low level. But there is not agreement.

When women entering delivery were having problems in labor, they would "consent" them for their kids to get either high levels of oxygen or low levels of oxygen. This ended up being a bit of a furor, because it turns out that high oxygen is better. It turns out that—guess what?—low-oxygen kids might not go blind, but they die, versus with high levels they live, but they may go blind. But the consent form said, "The purpose of this study is to reduce the risk of death and reduce the risk of blindness," period. It didn't say that you are going to have one or the other: either an increased risk of death and a decreased risk of blindness, or a decreased risk of death and an increased risk of blindness. So that is a bad informed consent.

So the quality of informed consent varies.

QUESTION: James Starkman. Thank you for a fascinating discussion.

Gilead Sciences, as you know, developed an apparently very effective hepatitis C drug. Merck is working on one, which may or may not be as effective. But in the meanwhile, hep C is a very widespread and lethal disease, and the cost of a single treatment, I think I read, is $80,000 to $100,000. They have a monopoly for the time being.

What are your views on the intersection of medical capitalism, which permits the development of those drugs, and what should be, if any, regulation of pricing in that kind of a situation, which basically prices most patients out of the market?

ROBERT KLITZMAN: Right. For hep C, there was a new drug introduced last year, and it costs about $80,000 per patient for a course of treatment. I think that's immoral. I think that's ridiculous. At first it was higher; I think they wanted $100,000 or $120,000. It's only because another company came along and said, "We're going to do ours for $60,000," that the first company came back and said, "Okay, it will be $80,000."

I think that's ridiculous. The issue is that drug companies and insurance companies have, in a way, bought out the political process. That is one of the problems. Often it is our tax dollars, through Medicaid and Medicare, that cover the people being charged $80,000. So in the end it is we who pay.

Drug companies argue that they need the money for research and development. It turns out that drug companies spend much more money on marketing than they do on research and development. We all see these ads on TV—"take XYZ drug." Those are expensive. I think the system is screwy.

For the study behind the book, I interviewed IRB chairs at a number of universities who said, "Our researchers are not getting government grants like they used to. They are not getting NIH grants, because those are harder to get." So they want industry grants. The industry grants are often for "me-too" drugs. A drug company will have a patent on a drug that is about to expire, so the company makes a little tweak in the drug and says, "This is the new drug. We want to extend the patent." The researchers will take patients who are doing well on their current drugs and switch them to the new drug to see how well it works, and the institution gets a couple of million dollars. This is helping the company; the primary purpose is not to help patients.

If only universities instead got a few million dollars of research money to do something that actually has more public health benefit, I think it would be great.

So I think it is partly a political process to draw lines on the maximum prices drug companies are allowed to charge for their products.

QUESTIONER: I assume Gilead spent hundreds of millions of dollars, if not a billion, developing this drug. I would assume that.

ROBERT KLITZMAN: It costs a lot of money to develop a drug, no question. The problem is that—I forget the number of lobbyists per congressman that the drug industry has, but it's huge. They have a huge lobbying effort. I think that is part of it.

QUESTION: My name is Greg Kerr. I'm the medical director of critical care services at Weill Cornell Medical College, New York Presbyterian Hospital.

First of all, I want to say thank you, Robert, for leading this conversation. It's a very important area, a very timely conversation. It is important that we have this.

Big Pharma has a difficult time enrolling minority patients and participants into their studies. A lot of it has to do with the history of Tuskegee and Johns Hopkins and a general distrust of the whole research process.

Without being coercive and ensuring that there is truly informed consent, any idea on how we can increase the numbers of participants in pharmaceutical studies? Without increasing the numbers of minorities in pharmaceutical studies, we don't know how those studies will affect minority patients.

ROBERT KLITZMAN: Right. Terrific question, extremely important issue.

I think there needs to be more trust built; the distrust that has been there needs to be overcome, and that takes time and work. Minority populations often distrust medical institutions as a whole, institutions that often have not held the interests of minority populations as high in their priorities as they should have. But I think people are educable. If you can show clearly what the benefits are, I think you can instruct or inspire people to see them and to overcome the distrust that has been there.

The study that he mentions at Hopkins was the Kennedy Krieger study, I think, with the lead paint. It was known that kids who eat lead paint get neurologic problems. Hopkins did a study and they wanted to get rid of the lead paint in homes and see if it decreased the number of neurologic problems. What they did was they said, "To completely abate the lead is going to cost this much money, so why don't we do two lesser levels? We will leave some of the lead there, and we will just get rid of some of the lead." They put families into these different homes, and some of the kids ended up with neurologic problems.

Of course, if they had said, "We're just going to get rid of all the lead"—they knew lead was a problem and they put kids in the homes that still had lead, which is partly what fueled the mistrust. They didn't think through how an outsider would look at the ethical issues involved with this.

I think pharma needs to do a better job at being as ethical as they could be. I think other medical institutions, too, need to really show how they do have the best interests of patients at heart.

JOANNE MYERS: Why doesn't anyone ever go after the drug companies in any of this? You mentioned the hospitals. They go after different doctors. But they never go after the drug companies.

ROBERT KLITZMAN: This is partly why the informed consent forms are so long. They are legalese because the drug companies are basically covering themselves. When you scroll down and click "I accept," most people don't read it. If you try to read it, you can't understand it. They only show you three lines at a time. The thing is 40 pages long, but you can see only three lines, and no sentence is only three lines long; they are all paragraphs. So even if you try to read it, you can't. They are trying to cover themselves.

JOANNE MYERS: But shouldn't there maybe be an institutional review board for drug companies? Is that part of that?

ROBERT KLITZMAN: Yes. Drug companies need to get approval of institutional review boards.

JOANNE MYERS: Are they separate, though, from looking at hospitals or doctors?

ROBERT KLITZMAN: Some of them are the same. I will get to that in a second.

For instance, IRBs say informed consent forms should not be written above an eighth-grade reading level. Almost all universities say that. Someone did a study looking at the grade level of informed consent forms, and 92 percent were above an eighth-grade level. So they are not doing their job of making informed consents understandable.

They are understaffed and underfunded by institutions, and they get pushback. Pharma wants these to be legal contracts, and a legal contract is a very complicated thing, especially when there may be medical injury: who is going to pay for problems that occur? I think that is unfortunate, and it creates distrust. I am not here to take care of you; I am here to get you to sign a legal document first. People feel they may be had.

QUESTION: Good evening, Dr. Klitzman. Youssef Bahammi is my name.

What do you think about the fact that the politics influence a lot regarding this topic of the ethics police? There are some countries, some cultures where the first vocation is going to be the reverence, because they are providence states and in some other countries, it's going to be the investigation, the first vocation, because they are police states, and that could influence the context of action of the human being and his growth from the young age. I would wonder if you adhere to this thesis and antithesis.

ROBERT KLITZMAN: Let me understand your question. It is in some countries people become reverends and in other places —

QUESTIONER: No, no, no. I was saying it's a matter of scales of freedoms that influence maybe this ethics police topic, in terms of political regimes, in terms of how the social environment or religion maybe could play a role on it.

What do you think about this, given your experience before with the U.S. Department of Defense, if there really is a political impact sometimes on the research?

ROBERT KLITZMAN: Yes. There are a lot of politics, and there are other cultural issues. In this country the largest political issue is the degree of influence that drug companies have on policy, such that they can charge $80,000 for a drug. There is also local politics in what happens with science. Within an institution, these committees may say, "Look, my friend Joe is a doctor down the hall. He needs this grant from this drug company, so we had better approve it." There is that kind of politics.

In other countries, there are other kinds of politics. In the developing world there is often a Ministry of Health that may say, "Our priority is that research should be on what we think is important."

One of the heads of a research ethics committee I interviewed in the developing world said, "In our country there is an anti-Yunkee feeling." I said, "Anti-Yunkee?" He was saying "anti-Yankee." He basically was saying that, because the United States had been involved doing things it shouldn't have been years ago in this country politically, the Ministry of Health now says that if there is a study being sponsored by the United States, you are going to give it a hard time. If it is sponsored by Great Britain, it will pass more readily.

Similarly, unfortunately, politics enters in—you remember the president of South Africa, Mbeki, for many years said HIV does not cause AIDS. He was against funding research on HIV prevention. He was against many, many things, and HIV took off. So you have those kinds of political influences also.

But even in this country, under President Bush (and I think it is still the case), the President's Emergency Plan for AIDS Relief [PEPFAR], through which we are giving billions of dollars in HIV drugs to countries in sub-Saharan Africa with a lot of HIV, required that one-third of the money, I believe, or a certain percent, go to abstinence-only, religious-based education, which we know doesn't work. Telling people, "Just don't have sex. You won't get HIV. Just abstain" does not work; we know that. But for political and religious reasons, that was built into the plan that was funded.

So in many ways, politics does enter these areas.

QUESTION: Marlin Mattson, from Weill Cornell Medical College, a physician in psychiatry. Thank you for a wonderful overview. It was very comprehensive.

My comment really has to do with the consenting process. What kind of monitoring actually occurs, by IRBs or within the research endeavor itself, of the consenting process, to see whether it is adequate or not? I know that some departments actually do monitor it to ensure that critical aspects, like the consenting process, are meeting the requirements that we really want.

ROBERT KLITZMAN: There is not much. It is a problem. Ideally, if someone is consenting into a study, particularly, say, an older person who may have some cognitive problems or someone with mental illness or someone who may not have the educational level, it would be great to have someone observe, a third person. If I am getting money from a drug company and I want you in my study, I will say whatever I want, perhaps, to get you in my study, whereas if there is an independent third person to make sure you understand it, that would be great.

Quote: "There's not money for that." Well, of course, there are billions of dollars in medical research money. So I think money could be made available, and I think it's important.

In terms of the research ethics committees, the IRBs, there is very little monitoring. The Roman satirist Juvenal once wrote, "Who will guard the guards?" The IRBs are supposed to be watching science, but who is watching them? There is the Office for Human Research Protections, OHRP, but they generally come in only if there is a scandal, and they are underfunded as well.

So there is not much. Part of the reason I wrote this is to say this is an important area. This is billions of dollars, and there needs to be some kind of larger thought, oversight about what they should be doing, what we should be doing with regard to informed consent and other things. I think that people shouldn't be forced to sign informed consent they don't understand.

QUESTION: Thanks. Jessica Kirk, from Siegel+Gale.

I have a two-part question. This issue about trust and the many IRBs—is there a difference in your research between community hospitals and academic medical centers that is worth commenting on?

Then I wanted to know—you have articulated so many of the forces behind why these are 40-plus pages, these informed consents. Besides voices like yours articulating it in this book, who else is sort of on the side of more clarity for the patients and more transparency?

ROBERT KLITZMAN: Those are great questions. There have been proposals. One element of Obama's 2011 set of proposals was that there be a short summary statement followed by the longer informed consent. I think that still needs to be explored. The problem is that a lot of institutions don't want that, because you need the whole thing: the fear is that a short summary of a really complicated study will inevitably leave things out. I think there just needs to be more time spent on consent.

I should say another whole problem, which you allude to, is that when these committees were set up by Congress as a result of the National Research Act of 1974, it said they need five members, including a non-scientific member and an unaffiliated member (someone not affiliated with the institution, an outsider), along with the scientists. Institutions have had trouble finding a non-scientist, unaffiliated member willing to do this, to read all these forms, so the two roles are often filled by just one person, who has come to be called the community member.

These committees usually have one such person, often the one woman of color in a roomful of white male doctors. I should say the committees are mostly male overall, but they are all white and they are almost all doctors. So you have this one community member, who usually feels unempowered and is not schooled to understand the complexities of some of these things. Some wealthy institutions will have someone whose only job, if they have six IRBs, is to help the community members understand what's going on and give them support. Other institutions can't afford that.

Why not require that 10 percent of each committee be unaffiliated members and 10 percent be non-scientific members? I think that is a simple change that would help a lot.

The other part of the question—one was community hospitals and the other was—

QUESTIONER: Speaking about all these institutions that have IRBs, is it a preponderance of academic medical centers where this is happening, because that is where the research is happening, or if you are talking about more rare diseases where they have to go across 50 or 60 institutions—

ROBERT KLITZMAN: Right. Any experiment on a human being needs approval from one of these committees, and most institutions have their own. Sometimes smaller NGOs may farm the review out to another committee. Actually, there is another phenomenon: for-profit IRBs. The largest is owned by a venture capital company; another is for sale now for $220 million. These are committees to which mostly drug companies will go and say, "We'll pay you money to approve our study." So there is a bit of a conflict of interest. I think conflicts of interest are everywhere; they just have to be managed.

So smaller institutions may do that. And a lot of community hospitals at this point are affiliated with a larger medical system and will go to another IRB in that system.

Why is there not more support for some of the things I am saying? A lot of patient advocacy groups—patient advocacy groups are great—a lot of them now partner with pharma to get studies done. For a lot of them, a major interest of theirs is to get support for research on their disease, whether the Parkinson's Disease Foundation or the Alzheimer's Disease Foundation that wants money for Alzheimer's research. They often work with pharma. So that has been an issue.

I think a lot of patient advocates push for this, but it has not gone as far as it should. I think the reason is because, whereas ethically these are meant to inform patients about what they are getting into, in the minds of institutions they are legal documents and for drug companies they are legal documents. If something bad happens, they point to it and say, "Well, we told you about it. On page 38, paragraph 5, line 3, it said you may develop cancer." That's why it is so hard.

QUESTION: Mike Koenig, Long Island University.

Many years ago, in the early 1970s, I was head librarian for Pfizer Research, back when Pfizer was run by scientists, not by accountants. I got a phone call from the president's office. President Laubach had been asked to testify in a couple of weeks before Senator Kennedy's subcommittee on the ethics of testing drugs on prison inmates, a practice that ended soon afterward, mostly because of the concern: what does informed consent mean when you are incarcerated without your consent?

But the data were fascinating. People who volunteered for those trials had a dramatically lower recidivism rate when they got out of jail. The first thing you learn in Statistics 101 is that correlation is not causation; it might just be that people with a social conscience are more likely to volunteer for drug trials and less likely to be repeat offenders when they get out. But the difference was so dramatic that most people suspected that, in addition to the new information being generated, there was a beneficial societal role for testing drugs on prison inmates.

My question is, is that possibility being reinvestigated at all, and how might it be handled if that were to be reintroduced?

ROBERT KLITZMAN: It's a complicated question. The problem is that for a number of years, some very unethical studies were conducted on "inmate populations." At the Willowbrook State School, for instance, they did experiments deliberately giving people hepatitis, as I recall: "There they are in an institution; let's just experiment on them." There was a Jewish home for the aged where experiments, if I recall correctly, were done on the same reasoning: "They're there; they are not going to leave. We'll just experiment on them. They don't understand it. It's okay. It's for their good."

Because there were ethical violations like that, stiffer guidelines were developed for doing research on prisoners. You can do research on prisoners, but a few conditions have to be met. One, it has to be something that is going to benefit the prisoners or the prison population. The problem is that prisoners are a conveniently available population, and you don't want them used for that reason alone. After all, that's sort of what the Nazis did: "Here's a population of people. Let's just experiment on them. They can't really say no. They are just here." So they can't give full informed consent.

I would argue, though, that as a result of these stricter regulations (you can do the research, but the requirements are high: prisoner representatives must be involved, and so on), we now know much less about prisoners than we should. So there needs to be a better balance.

You don't want research done on them just because, hey, it's convenient: they are there, they feel they have to do it, and they may think it will get them out early.

I think the reason that prisoners who volunteered for research had less recidivism, as you say, could be, one, that they are more socially minded; you didn't use the word "altruistic," but that is the idea. Or maybe they are better educated. We know that better-educated people are more likely to participate in studies, or it often seems that way, and better-educated prisoners may also do better when they get out.

Of course, you would also want to look at what the studies were for. Were they for mental illness? For cancer? For psychosocial questions?

So I think you are right that, because of excesses, we have developed a lot of guidelines that may now be too restrictive. But getting the right balance is what we need to do.

QUESTION: Jamie Levey, from the European Huntington's Disease Network. Thank you for the great discussion. I'm looking forward to reading your book because I know I will find a lot of solutions to the questions I have.

I had the pleasure of recently observing some IRBs, and what I noticed was, because of the composition, being scientists or doctors and not a lot of community representatives or patient advocates, they focus on the science and they focus on design of the protocol and safety and adverse events, which, of course, are very important. But they really don't look at the comprehensiveness of the informed consent.

It seems such a simple fix, and I still don't understand why they can't make them comprehensible to the average person. I read a lot of them as part of my work. I have never read one that actually flows, that you can just read as a layperson. Why is that so complicated?

ROBERT KLITZMAN: I think more people need to speak up and say they need to be comprehensible. I think there is no reason they can't be. The reasons are, one, because institutions want them to be legal contracts; two, as I said, drug companies often want them to be legal contracts; three, the poor community person—it would be great if they would speak up and say, "No one can understand what this is saying"—feels that the doctors must know what they are doing. "I'm the poor outside person and here is this roomful of white folks who are telling us what to do. They must know."

There is actually a tension here. IRBs say forms should be shorter and comprehensible, but making them comprehensible can make them longer. Rather than just say, "I'm going to do an MRI on you," if you have to explain what an MRI is, that takes a paragraph. If I say, "We are going to put your blood in a biobank," explaining what a biobank is will take a page.

I think the notion that you could do this in five minutes is ludicrous. People should be able to take the form home; that needs to happen more. And more patient advocates and others have to say that the current forms are unacceptable.

JOANNE MYERS: Thank you once again. Advancing science and protecting humans is a big challenge. Thank you.

ROBERT KLITZMAN: Thank you for having me.
