IRBs come in many types, focusing on dramatically different categories of research, from Phase I drug trials to studies focused solely on education and social science. It may seem at first that the latter entails simpler ethical problems than, say, multisyllabic first-in-human drug trials for cancer; but very low-risk nonmedical research presents its own challenges for the IRB.
As we know from the Belmont Report, one of our basic ethical responsibilities is to assess the anticipated risks and benefits of the project before us. The risks and benefits may accrue to different persons: the participants in the research project may bear the risks, while the benefits could come later to an entirely different set of persons. Sometimes the benefit may be finding out that an intervention doesn't work. But one way or another, the risks should not exceed the hoped-for benefits; we should be willing to put participants at more risk to find a cure for cancer than for the common cold.
From this perspective, low-risk research can raise political problems within the institution. It is one thing to say to a researcher, "Sorry, this is an elegant study of an important issue, but the risks to participants are simply too high." They may be angry and frustrated, but at least you have not impugned the quality of their work.
Contrast that with, "The risks to participants are extremely low, but this is such a poorly designed study that the likely benefits are even lower." Say something along those lines too many times, and you are going to reap some real political trouble.
Let me give an example of a research protocol submitted to an IRB on which I served (not at my current institution). The PI, a graduate student from the college of education, was tackling an important topic: the effect of class size on student experience. His plan was to take a small group of middle school students who all had the same two classes, a large English class with Ms. X and a small math class with Mr. Y. He would interview the students and ask them which class they paid better attention in.
He showed no awareness of the obvious confounders: different teachers, different class subjects, different classmates, different times of day…and if by some miracle he had actually discovered a piece of generalizable knowledge, his application showed that he lacked the ability to turn it into a publishable paper.
I objected, as the likely benefit to anyone was roughly zilch, and that’s where things got interesting. Turning down an application outright is not a pleasant thing to do, especially when the PI (or the supervisor) is in your institution. The other IRB members remonstrated with me. Sure, they agreed, this wasn’t stellar research, but this research posed virtually no risk, so who cares if it also offered little benefit? In order to defend my No vote, I had to point out that there was some small risk of inconvenience and wasted time, and I had to be ever more vigorous and uncomplimentary in my description of the unlikely benefits. Repeat this experience on a monthly basis, and pretty soon one college or department believes that the IRB is “gunning” for it.
This problem seems especially common in schools of education, although certainly not all of them. In order to move up the ladder, teachers and administrators need graduate degrees, and a graduate degree typically requires a component of original research. The students have little background in research design, and even less interest, but the research has to be done. Typically, they conduct it in the schools where they currently teach or serve as administrators, raising additional issues of student confidentiality.
Unfortunately, our commitment to protect the interests of human subjects can turn the IRB into a defender of academic rigor, a role for which it may not be best suited. Nonetheless, especially where recruited subjects are children, I believe the IRB must stand its ground and refuse to approve protocols of no possible benefit, even if it is low-risk research.
Dena S. Davis, JD, PhD, has taught at Central Michigan University, Cleveland State University, and Lehigh University. She has served on five IRBs.
Guest contributors are valued members of our community willing to share their insights. The views expressed in their posts do not necessarily reflect those of PRIM&R or its employees.
Thank you for your post. I especially appreciated the observation that the students have received little to no training in research design, yet they are required to do research. I agree! Too often, IRB staff and members shrug their shoulders and say: "What can we do?" Thank you for proposing some action, especially with regard to research involving school children.
Unless one knows there is absolutely no possibility of learning something from research that is clearly minimal risk, IRBs and IRB reviewers acting as enforcers of "science" spend misplaced time and effort that neither protects human subjects nor minimizes research risks. Could the proposed project not be framed as a pilot or exploratory study, using qualitative methods to ask students whether they feel class size affects their performance? Also, many schools have a pathway for proposals like the one cited: they can satisfy a "research" requirement or serve as "practice," with little expectation of surviving peer review.
Thank you for tackling this sometimes uncomfortable topic. At our institution we do often require applicants (mostly students, but not all) to make revisions to improve the quality – and hopefully the likely benefit – of the project before approval. Often this is in the form of both stipulations and suggestions with direction to “meet with your adviser to discuss” the needed revisions. We view this as part of our role to educate students and advisers in a more collaborative way, rather than only presenting barriers to human subjects research.
One potential benefit you have left out: the educational benefit to students as they engage in the research process.